Statement
of the
American Library Association
to the
Subcommittee on Science, Technology, and Space
Senate Committee on Commerce, Science, and Transportation
for the hearing record of March 5, 1991
on
S. 272 The High-Performance Computing Act of 1991
The National Research and Education Network, which S. 272 would
create, could revolutionize the conduct of research, education, and
information transfer. As part of the infrastructure supporting
education and research, libraries are already stakeholders in the
evolution to a networked society. For this reason, the American
Library Association, a nonprofit educational organization of more
than 51,000 librarians, educators, information scientists, and library
trustees and friends of libraries, endorsed in January 1990 and again
in January 1991 the concept of a National Research and Education
Network.
ALA's latest resolution, a copy of which is attached, identified
elements which should be incorporated in legislation to create the
NREN, a high-capacity electronic highway of interconnected networks
linking business, industry, government, and the education and
library communities. ALA also joined with 19 other education,
library, and computing organizations and associations in a
Partnership for the National Research and Education Network. On
January 25, 1991, the Partnership organizations recommended a
policy framework for the NREN which also identified elements to be
incorporated in NREN legislation.
Within that framework, ALA recommends the following additions
to the pending NREN legislation to facilitate the provision of the
information resources users will expect on the network, to provide
appropriate and widely dispersed points of user access, and to
leverage the federal investment.
NREN authorizing legislation should provide for:
A. Recognition of education in its broadest sense as a reason for
development of the NREN;
B. Eligibility of all types of libraries to link to the NREN as
resource providers and as access points for users; and
C. A voice for involved constituencies, including libraries, in
development of network policy and technical standards.
NREN legislation should authorize support for:
A. High-capacity network connections with all 50 states;
B. A percentage of network development funds allocated for
education and training; and
C. Direct connections to the NREN for at least 200 key libraries and
library organizations and dial-up access for multitype libraries
within each state to those key libraries. Prime candidates (some of
which are already connected to the Internet) for direct connection to
the NREN include:
- The three national libraries (Library of Congress, National
Agricultural Library, National Library of Medicine) and other federal
agency libraries and information centers;
- Fifty-one regional depository libraries (generally one per
state) which have a responsibility to provide free public access to all
publications (including in electronic formats) of U.S. government
agencies;
- Fifty-one state library agencies (or their designated resource
libraries or library networks) which have responsibility for
statewide library development and which administer federal funds;
- Libraries in geographic areas which have a scarcity of NREN
connections;
- Libraries with specialized or unique resources of national or
international significance; and
- Library networks and bibliographic utilities which act on
behalf of libraries.
The National Science Foundation, through its various programs,
including science education, should provide for:
A. The inclusion of libraries both within and outside of higher
education and elementary and secondary education as part of the
research and education support structure;
B. Education and training in network use at all levels of education;
and
C. Experimentation and demonstrations in network applications.
ALA enthusiastically supports development of an NREN with
strong library involvement for several reasons.
1. The NREN has the potential to revolutionize the conduct of
research, education, and information transfer. As basic literacy
becomes more of a problem in the United States, the skills needed to
be truly literate grow more sophisticated. ALA calls this higher set of
skills "information literacy"-knowing how to learn, knowing how to
find and use information, knowing how knowledge is organized.
Libraries play a role in developing these skills, beginning with
encouraging preschool children to read.
Libraries as community institutions and as part of educational
institutions introduce users to technology. Many preschoolers and
their grandparents have used a personal computer for the first time
at a public library. Libraries are using technology, not only to
organize their in-house collections, but to share knowledge of those
collections with users of other libraries, and to provide users with
access to other library resources, distant databases, and actual
documents. Libraries have begun a historic shift from providing
access primarily to the books on the shelves to providing access to
the needed information wherever it may be located. The NREN is the
vehicle librarians need to accelerate this trend.
In Michigan, a pilot program called M-Link has made librarians
at a group of community libraries full, mainstream information
providers. Since 1988, M-Link has enabled libraries in Alpena, Bay
County, Hancock, Battle Creek, Farmington, Grand Rapids, and Lapeer
to have access to the extensive resources of the University of
Michigan Library via the state's MERIT network. The varied requests
of dentists, bankers, city managers, small business people,
community arts organizations, and a range of other users are
transmitted to the University's librarians via telephone, fax, or
computer and modem. Information can be faxed quickly to the local
libraries from the University. Access to a fully developed NREN
would increase by several orders of magnitude both the amount and types of
information available and the efficiency of such library
interconnections. Eventually, the NREN could stimulate the type of
network that would be available to all these people directly.
School libraries also need electronic access to distant resources
for students and teachers. In information-age schools linked to a
fully developed NREN, teachers would work consistently with
librarians, media resource people, and instructional designers to
provide interactive student learning projects. Use of multiple sources
of information helps students develop the critical thinking skills
needed by employers and needed to function in a democratic society.
This vision of an information-age school builds on today's
groundwork. For instance, the New York State Library is providing
dial-up access for school systems to link the resources of the state
library (a major research resource) and more than 50 public,
reference, and research library systems across the state. The schools
had a demonstrated need for improved access to research and other
difficult-to-locate materials for students, faculty, and administrators.
2. Current Internet users want library-like services, and libraries
have responded with everything from online catalogs to electronic
journals. As universities and colleges became connected to the
Internet, the campus library's online catalog was one of the first
information resources faculty and students demanded to have
available over the same network. Some 200 library online catalogs
are already accessible through the Internet. Academic library users
increasingly need full text databases and multimedia and
personalized information resources in an environment in which the
meter is not ticking by the minute logged, the citation downloaded,
or the statistic retrieved. A telecommunications vehicle such as the
NREN can help equalize the availability of research resources for
scholars in all types, sizes, and locations of higher education
institutions.
Libraries will be looked to for many of the information
resources expected to be made available over the network, and
librarians have much to contribute to the daunting task of organizing
the increasing volumes of electronic information. The Colorado
Alliance of Research Libraries, a consortium of multitype libraries,
not only lists what books are available in member libraries, but its
CARL/Uncover database includes tables of contents from thousands
of journals in these libraries. Libraries are also pioneering in the
development of electronic journals. Of the ten scholarly refereed
electronic journals now in operation or in the planning stages, several
are sponsored by university libraries or library organizations.
3. Libraries provide access points for users without an
institutional base. Many industrial and independent researchers do
not have an institutional connection to the Internet. All such
researchers and scholars are legitimate users of at least one public
library. The NREN legislation as introduced does not reflect current
use of the networks, much less the full potential for support of
research and education. Because access to Internet resources is
necessary to this goal, many libraries outside academe without access
to academic networks have developed creative, if sometimes
awkward, ways to fill the gap. A number of high schools have guest
accounts at universities, but only a few have managed to get direct
connections. CARL, the Colorado Alliance of Research Libraries,
reaches library users regardless of the type of library they are using
or their point of access. The development of community computer
systems such as the Cleveland Free-net is another example of
providing network access to a larger community of library users.
Several Cleveland area public, academic, and special libraries are
information providers on the Free-net as well.
Most of the companies in California high-technology centers
either began as or still have fewer than 50 employees. For these
companies, there is no major research facility or corporate library.
The local public libraries provide strong support as research
resources for such companies. The California State Library has
encouraged and supported such development, for example, through
grants to projects like the Silicon Valley Information Center in the
San Jose Public Library. Library access to the NREN would improve
libraries' ability to serve the needs of small business.
Support of research and education needs in rural areas could
also be aided through library access to the NREN. Even without such
access, libraries are moving to provide information electronically
throughout their states, often through state networks. An example is
the North Carolina Information Network. NCIN, through an agreement
between the State Library and the University of North Carolina's
Educational Computing Service, provides information access to almost
400 libraries in every part of the state, from university and
corporate libraries in the Research Triangle Park, to rural mountain
and coastal public libraries, to military base libraries. Using federal
Library Services and Construction Act funds, the State Library
provides the local equipment needed at the packet nodes to permit
these local libraries to access the system (called LINCNET).
The information needs of rural people and communities are just
as sophisticated and important as the needs of the people in urban
areas. Because the North Carolina network is available in rural
libraries, small businesses in these communities have access for the
first time to a state database of all contracts for goods, services, and
construction being put out for bid by the state, just one example of
the network's contribution to economic development. The key to the
network's growing success is the installation of basic computer and
telecommunications hardware in the libraries, access to higher speed
data telecommunications, and the database searching skills of the
librarians.
4. With libraries and their networks, the support structure to
make good use of the NREN already exists. Librarians have been
involved in using computers and telecommunications to solve
information problems since the 1960s when the library community
automated variable-length and complex records, a task which was
not being done by the computer field at the time. Librarians
pioneered in the development of standards so that thousands of
libraries could all use the same bibliographic databases, unlike e-
mail systems today which each require a different mode of address.
The library profession has a strong public service orientation and a
cooperative spirit; its codes of behavior fit well with that of the
academic research community.
Libraries have organized networks to share resources, pool
purchasing power, and make the most efficient use of
telecommunications capacity and technical expertise. Upgrading of
technological equipment and technological retraining are recognized
library requirements, although the resources to follow through are
often inadequate. The retraining extends to library users as well.
Librarians are familiar with the phenomenon of the home computer
or VCR purchaser who can word process or play a tape, but is all
thumbs when it comes to higher functions not used every day.
Computer systems, networks, and databases can seem formidable to
the novice and are often not user-friendly. Expert help at the library
is essential for many users.
5. NREN development should build on existing federal investments
in the sharing of library and information resources and the
dissemination of government information. The Internet/NREN
networks are in some cases not technically compatible with current
library networking arrangements. However, the government or
university database or individual expert most appropriate to an
inquiry may well be available only via the Internet/NREN. Access to
specific information resources and the potential linkage to scarce
human resources is one reason why most librarians are likely to
need at least some access to the NREN.
As the Internet/NREN is used by various federal agencies, it
becomes a logical vehicle for the dissemination of federal
government databases. The Government Printing Office, through its
Depository Library Program, has begun providing access to
government information in electronic formats, including online
databases. A unified government information infrastructure
accessible through depository libraries would enable all sectors of
society to use effectively the extensive data that is collected and
disseminated by the federal government. Disseminating time-
sensitive documents electronically would allow all citizens, small
businesses, and nonprofit groups to have real-time access to
government information through an existing organized system of
depository libraries. The 51 regional libraries (generally one in each
state, many of which are university and other libraries already
connected to the Internet) could provide the original nodes for such a
system. Together with major libraries capable of providing such
support, these libraries could provide access for smaller libraries and
selective depositories within their states or regions through dial-up
facilities or local area networks.
The library community has been assisted and encouraged in its
networking efforts by the federal government beginning in the
1960s, and more recently by state support also, in ways that track
well with the NREN model. The federal government spends in the
neighborhood of $200 million per year on programs which promote
and support interlibrary cooperation and resource sharing and
library applications of new technology. These programs range from
the Library Services and Construction Act, the Higher Education Act
title II, the Depository Library Program, the library postal rate, and
the Medical Library Assistance Act to programs of the three national
libraries: the Library of Congress, the National Agricultural Library,
and the National Library of Medicine.
If academic libraries continue their migration to the
Internet/NREN as the network of choice both on campus and for
communication with other academic institutions, it will not be
long before academic libraries and public libraries find themselves
unable to talk to one another electronically. This result will be totally
at odds with the goals of every major legislative vehicle through
which the federal government assists libraries. In addition, it makes
no sense, given the intimate connection of public libraries to the
support structure for research and education. While public libraries
have long been recognized as engines of lifelong learning, the
connection is much more direct in many cases, ranging from the
magnificent research resources of a New York Public Library to the
strong support for distance learning provided by many public
libraries in Western states.
Interlibrary loan and reference referral patterns also show that
every kind of library supports every other's mission. The academic,
public, school, state, national, and specialized libraries of the nation
constitute a loose but highly interconnected system. A network
which supports research and education, or even research alone,
cannot accomplish the job without including this multitype system of
libraries in planning, policy formulation, and implementation.
6. The NREN's higher speeds will enable the sharing of full text and
nontextual library and archival resources. Libraries will increasingly
need the higher capacity of the NREN to exploit fully library special
collections and archives. The high data rates available over the fully
developed NREN will make possible the transmission of images of
journal articles, patents, sound and video clips, photos, artwork,
manuscripts, large files from satellite data collection archives,
engineering and architectural design, and medical image databases.
Work has already begun at the national libraries and elsewhere;
examples include the Library of Congress American Memory project
and the National Agricultural Library text digitizing project.
7. Libraries provide a useful laboratory for exploration of what
services and what user interfaces might stimulate a mass
marketplace. One purpose of the NREN bills since the beginning has
been to promote eventual privatization of the network. Libraries
have already demonstrated the feasibility and marketability of
databases in the CD-ROM format. Libraries also convinced proprietors
and distributors to accommodate the mounting on local campus
systems of heavily used databases. Libraries can serve as middle- to
low-end network use test beds in their role as intermediaries
between the public and its information requirements.
8. Public, school, and college libraries are appropriate institutions
to bridge the growing gap between the information poor and the
information rich. While we pursue information literacy for all the
population, we can make realistic progress through appropriate
public service institutions such as libraries. However, while an
increase in commercial services would be welcome, any transition to
privatization should not come at the expense of low-cost
communications for education and libraries. Ongoing efforts such as
federal library and education legislation, preferential postal rates for
educational and library use, and federal and state supported library
and education networks provide ample precedent for continued
congressional attention to open and inexpensive access.
In conclusion, the NREN legislation would be strengthened in
reaching the potential of the network, in ALA's view, with the
addition of the elements we have enumerated above. Our
recommendations represent recognition of the substantial
investment libraries have already made in the Internet and in the
provision of resources available over it, authorization of modest and
affordable near-term steps to build on that base for library
involvement in the NREN, and establishment of a framework for
compatible efforts through other federal legislation, and state and
local library efforts.
ATTACHMENT
WASHINGTON OFFICE
American Library Association
110 Maryland Avenue, N.E.
Washington, D.C. 20002
(202) 547-4440
Resolution on a National Research and Education Network
WHEREAS, The American Library Association endorsed the concept
of a National Research and Education Network in a Resolution passed
by its Council (1989-90 CD #54) on January 10, 1990; and
WHEREAS, Legislation to authorize the development of a National
Research and Education Network has not yet been enacted; and
WHEREAS, High-capacity electronic communications is increasingly
vital to research, innovation, education, and information literacy; and
WHEREAS, Development of a National Research and Education
Network is a significant infrastructure investment requiring a
partnership of federal, state, local, institutional, and private-sector
efforts; and
WHEREAS, Libraries linked to the National Research and Education
Network would spread its benefit more broadly, enhance the
resources to be made available over it, and increase access to those
resources; now, therefore, be it
RESOLVED, That the American Library Association reaffirm its
support of a National Research and Education Network, and
recommend incorporation of the following elements in NREN
legislation:
- Recognition of education in its broadest sense as a reason for
development of the NREN;
- Eligibility of all types of libraries to link to the NREN as
resource providers and as access points for users;
- A voice for involved constituencies, including libraries, in
development of network policy and technical standards;
- High-capacity network connections with all 50 states and
territories;
- Federal matching and other forms of assistance (including
through other federal programs) to state and local education and
library agencies, institutions, and organizations.
Adopted by the Council of the American Library Association
Chicago, Illinois
January 16, 1991
(Council Document #40)
Executive Offices: 50 East Huron Street, Chicago, Illinois 60611
(312) 944-6780
ASSOCIATION OF RESEARCH LIBRARIES
1527 New Hampshire Avenue, N.W., Washington, D.C. 20026
(202) 232-2466 FAX (202) 462-7849
Statement of the Association of Research Libraries
to the
Subcommittee on Science, Technology, and Space
Senate Committee on Commerce, Science and Transportation
for the Hearing Record of March 5, 1991
on S. 272 - The High-Performance Computing Act of 1991
The Association of Research Libraries is a non-profit Association of
119 research libraries in North America. The membership of ARL is
actively involved in the provision of information resources, including
those that are unique, to the research and education
communities of North America. Research libraries also are key
participants in numerous experiments and pilot programs that
demonstrate the utility of high capacity networks for the exchange
and use of information. ARL supports the passage of legislation that
will promote the development and use of expanded networking
capacities and capabilities to advance education and research.
The need for a high-speed computer communications network is
a reflection of a number of changes underway in the academic and
library communities. Three of these changes are the need to
connect researchers with facilities such as supercomputers,
databases, and library resources; the changing manner in which
scholars and researchers communicate; and finally, the ability of
these researchers to manipulate and combine large data sets or files
in new ways only possible through connecting users with high-speed,
high-capacity networks.
The NREN, the vision of the next generation network designed to
support the work of the education and research communities,
must reflect the changes noted above as well as those efforts already
underway that address the new uses of information, while at the
same time addressing the national goals of improving our Nation's
productivity and international competitive position. To realize these
goals and to build upon existing efforts, ARL, with others in the
education community, supports the inclusion of the following points in
NREN legislation. These points build upon existing successful federal,
state, and local programs that facilitate access to information
resources.
NREN authorizing legislation should provide for:
- Recognition of education in its broadest sense as a reason for
development of the NREN;
- Eligibility of all types of libraries to link to the NREN as
resource providers and as access points for users;
- A voice for involved constituencies, including libraries, in
development of network policy and technical standards.
NREN legislation should authorize support for:
- High capacity network connections with all 50 states;
- A percentage of network development funds allocated for
education and training;
- Direct connections to the NREN for at least 200 key libraries
and library organizations and dial-up access for multi-type libraries
within each state to those key libraries. Prime candidates for direct
connections include:
* The three national libraries (Library of Congress, National
Agricultural Library, National Library of Medicine) and other federal
agency libraries and information centers;
* 51 regional depository libraries (generally one per state)
which have a responsibility to provide free public access to all
publications (including in electronic formats) of U.S. government
agencies;
* 51 state library agencies (or their designated resource
libraries or library networks) which have responsibility for
statewide library development and which administer federal funds;
* Libraries in geographic areas which have a scarcity of
NREN connections;
* Libraries with specialized or unique resources of national
or international significance;
* Library networks and bibliographic utilities which act on
behalf of libraries.
The National Science Foundation, through its various programs,
including science education, should provide for:
- The inclusion of libraries both within and outside of higher
education and elementary/secondary education as part of the
research and education support structure;
- Education and training in network use at all levels of
education;
- Experimentation and demonstrations in network applications.
The information infrastructure of the United States is a complex
conglomeration of public and private networks, institutions,
information resources, and users from educational, research, library,
and industrial communities with extensive ties to international
networks and infrastructures. Research libraries and the resources
that they acquire, organize, maintain, and/or provide access to, are
critical elements of this infrastructure. In support of their mission to
advance scholarship and research, these same libraries have been at
the forefront of the technological revolution that has made this
robust and evolving information infrastructure possible.
One of the most exciting and unanticipated results of the
NSFNET has been the explosive growth of the network as a
communications link. The enhanced connectivity permits scholars
and researchers to communicate in new and different ways and
stimulates innovation. Approximately one-quarter of the use of
NSFNET is for E-mail, one-quarter for file exchange, 20% for
interactive applications, and 30% for associated services. It is this
latter category that is growing at an extraordinary rate and includes
new and innovative library uses of networks. This growth rate
demonstrates the value that researchers place on access to library
and information resources in support of education and research. The
following examples demonstrate the types of activities underway in
academic and research libraries that utilize networks.
In the past year, the number of library online catalogs available
on the Internet has jumped from thirty to over 160, including those
in Canada, Australia, Germany, Mexico, New Zealand, Israel, and the
United Kingdom. A single point of access to 100 online public access
catalogs is possible today through a midwestern university. Access to
resources identified in online public access catalogs is of increasing
importance to researchers, as they can access a greatly expanded
array of information resources in a more timely and efficient
fashion. Needed information can be located at another institution,
and depending upon the nature and format of the information,
downloaded directly, and/or requested via interlibrary loan. Over
time, this practice will likely change to the researcher obtaining the
information directly online versus "ordering the information online."
Typical use of an online catalog at a major research institution is that
of LIAS at the Pennsylvania State University Library; there are
approximately 33,000 searches of the LIAS system each day.
The National Agricultural Library, NAL, is supporting a project
with the North Carolina State University Libraries to provide
Internet-based document delivery for library materials. Scanned
images of documents generate machine-readable texts which are
transmitted via the NSFNET/Internet to libraries, researchers'
workstations, and agricultural research extension offices. Images of
documents can be delivered directly to the researcher's computer,
placed on diskette, or printed. This program will be extended to the
entire land-grant community of over 100 institutions as well as to
other federal agencies and to the international agricultural research
community.
Another example of new library services made possible by
information technologies and networks, one that meets a growing
demand in the research community and represents a network
growth area, is the licensing of commercial journal databases by
libraries. Four of the last five years of the National Library of
Medicine's MEDLINE database are accessible to the University of
California community, and there are approximately 50,000 searches
of the system each week. The benefits to researchers and libraries
are numerous: enhanced access to the journal literature, lower costs
to the library than for use of commercial systems, and, because of
those lower costs, greater use of the files by researchers, which in
turn promotes innovation. As other research libraries mount files,
similar use patterns have occurred.
Although Internet access to proprietary files is not permitted,
there are other services available such as UNCOVER that are more
widely accessible. UNCOVER is a database with the tables of contents
for approximately 10,000 multi-disciplinary journals developed by
the Colorado Alliance of Research Libraries. The increasing demand
for UNCOVER demonstrates the academic community's need for such
services, particularly services available at a low cost to those
institutions unable to mount proprietary files locally.
One area of networked services forecast to present new
opportunities for dissemination and exchange of information in the
scholarly and research communities and where a significant amount
of experimentation and "rethinking" is anticipated, is in electronic
publishing. Publishing electronically is in its infancy. Today, there are
ten refereed journals on the Internet and it is anticipated that there
will be many times this number in a short while. These journals,
available via the Internet, range from Postmodern Culture (North
Carolina State University) to New Horizons in Adult Education
(Syracuse University) to PSYCOLOQUY (American Psychological
Association and Princeton University).
The nature and format of the electronic journal is evolving. To
some, the electronic journal is a substitute for the "printed" journal.
There are an increasing number of "paper-replicating electronic
journals," and the growing number of titles on CD-ROM and the rapid
rate of acceptance of this format are a testament to the value of the
electronic format. It is anticipated that many of the paper publishers
will offer an electronic version of their journals via intermediaries
such as DIALOG and CARL as the use of and capabilities of networks
expand. This model also presents new dissemination choices to
government agencies. The National Agricultural Library has begun to
negotiate agreements with scholarly societies for the optical scanning
of agricultural titles and information.
Another view of the electronic journal is one more of process
than product. Information or an idea is disseminated on the network
for open critique, comment, dialog, and exchange. In this instance,
publishing is an ongoing, interactive, non-static function, and one
that encourages creativity, connectivity, and interactivity.
Researchers experimenting in this camp are referred to as
"skywriters" or "trailblazers." In fact, publishing in this arena takes
on a new meaning due to the network's capabilities. The use of
multi-media including sound, text, and graphics, the significantly
expanded collaborative nature of the scholarly exchange not possible
with a printed scholarly publication, and finally, the potential for a
continuously changing information source, distinguish this
electronic journal from its counterpart, the paper-replicating
electronic journal. An online publishing program on the Genome
Project at the Welch Library at Johns Hopkins University is an
example of this type of electronic publishing. Text is mounted on a
database, accessed by geneticists, students, and critics who respond
directly via electronic mail to the author. In this case, a computerized
textbook is the end result but one which constantly changes to reflect
new advances in the field. Funding from the National Library of
Medicine has supported this project.
A final area where electronic publishing activities are underway
is in the academic publishing community. Two examples of activities
include efforts in the high energy physics and mathematics
communities. A preprint database in high energy physics has been
maintained for fifteen years by a university research facility with
approximately 200 preprints added each week to the database of
over 200,000 article citations. Instant Math Preprints (IMP), a new
initiative that will maintain a searchable database of abstracts, will
permit electronic file transfer of the full text of preprints. The project
will be accessible via ten universities and "e-math," the American
Mathematical Society's electronic service. The value to the research
community of timely and effective exchange of research results will
be enormous.
There are two predominant reasons that pilot projects and
experiments such as these have been possible, have flourished, and
been successful. First, a high value has been placed and a significant
investment has been made in carefully constructed cooperative
programs in the library community to advance research through the
sharing of resources. The creation and support of bibliographic
utilities such as the Research Libraries Information Network (RLIN)
and the Online Computer Library Center (OCLC) has resulted in access
by scholars to enormous databases of bibliographic records and
information. Cooperative programs have been supported and
encouraged by federal programs such as the Library Services and
Construction Act of 1964 and the Higher Education Act of 1965. The
Higher Education Act and in particular Title II-C and Title II-D
programs have emphasized the sharing of resources between all
types of libraries and users, and provided needed funds for support
of technological innovations and developments. These programs have
also promoted equality of access to information, ensuring that those
collections housed in major research institutions are broadly
accessible.
The second reason that libraries have succeeded in advancing
the exchange of information resources is the effective use of
technologies to promote access. Most, if not all, of these cooperative
programs depend in part upon networks as the means to
identify and share information resources. What will be required as
more resources become available through the Internet will be the
development of network directories. These directories will assist
users in learning what resources are available and how to access
them. Provision of these electronic resources and the development of
the ensuing access tools such as directories are already presenting
many challenges to library and information science professionals and
will require continuing attention if the NREN is to succeed.
As a consequence, the needed infrastructure to connect a
diversity of users to a wide array of information resources is in place
today. Networks interconnecting information resources and users
throughout all parts of the United States and internationally have
been operational and effective for a number of years. A key factor
that will permit the NREN to be a success is that much of the
infrastructure is already in place. There are networks that
interconnect academic institutions, public and private, industrial
users, and state consortia; that include library networks; and that
do not distinguish between rural and urban, academic and K-12. The
NREN vision must continue to encourage and demand enhanced
interconnectivity between all users and all types of institutions.
As Congress considers how to best design the NREN to meet the
needs of the research and academic communities, it will be more
important than ever to include the goals and objectives of ongoing
programs. In a time when there are 1,000 books published
internationally each day, 9,600 different journals are published
annually in the United States, the total of all printed knowledge is
doubling every eight years, electronic information is just beginning
to be exploited, and financial and funding resources are shrinking, it
is critical that the research and education communities, with
continued federal support, strive for increased connectivity between
all types of libraries and users. This connectivity will result in
improved productivity and a strengthening of the U.S. position in the
international marketplace.
S. 272 should provide the necessary framework to achieve this
enhanced connectivity. S. 272 should build upon existing programs
and identify new means to permit information resources to be
broadly available to the education and research communities.
Ensuring connectivity through multiple types of libraries, throughout
the United States, is a critical component to several existing statutes
and should be included in NREN legislation. By so doing, the
legislation would leverage existing federal, state, and local programs.
As libraries and users alike employ information technologies to
access information resources, new opportunities and applications will
develop that exploit the wealth of information and knowledge
available in research libraries. Network applications today primarily
focus on the provision of access to resources such as books, journals,
and online files. Electronic publishing ventures are just beginning. In
the years ahead, scholars and researchers will be able to access and
use those research materials and collections generally inaccessible
but of extreme research value including photographs, satellite data,
archival data, videos and movies, sound recordings, slides of
paintings and other artifacts, and more. Access to and manipulation
of these information resources advances scholarship and research,
and scholars will expect a network with the capacity and capabilities
to achieve effective access. Clearly, to be successful, effective, and of
use to the academic and research communities, the NREN must be
designed to nurture and accommodate both the current as well as
future yet unknown uses of these valuable information resources.
United States General Accounting Office
Testimony
GAO
Supercomputing in Industry
For Release on Delivery
Expected at 2:00 p.m. EST Tuesday, March 5, 1991
Statement for the record by
Jack L. Brock, Jr.,
Director Government Information and Financial Management Issues
Information Management and Technology Division
Before the Subcommittee on Science, Technology, and Space
Committee on Commerce, Science, and Transportation
United States Senate
GAO/T-IMTEC-91-3
Messrs. Chairman and Members of the Committee and Subcommittee:
I am pleased to submit this statement for the record, as part of the
Committee's hearing on the proposed High Performance Computing
Act of 1991. The information contained in this statement reflects the
work that GAO has conducted to date on its review of how industries
are using supercomputers to improve productivity, reduce costs, and
develop new products. At your request, this work has focused on
four specific industries--oil, aerospace, automobile, and
pharmaceutical/chemical--and was limited to determining how these
industries use supercomputers and to citing reported benefits.
We developed this material through an extensive review of
published documents and through interviews with knowledgeable
representatives within the selected industries. In some cases, our
access to proprietary information was restricted. Since this statement
for the record reports on work still in progress, it may not fully
characterize industry use of supercomputers, or the full benefits
likely to accrue from such use.
BACKGROUND
A supercomputer, by its most basic definition, is the most powerful
computer available at a given time. While the term supercomputer
does not refer to a particular design or type of computer, the basic
design philosophy emphasizes vector or parallel processing,
[Footnote 1: Vector processing provides the capability of operating on
arrays, or vectors, of information simultaneously. With parallel
processing, multiple parts of a program are executed concurrently.
Massively parallel supercomputers are currently defined as those
having over 1,000 processors.]
aimed at achieving high levels of calculation very rapidly. Current
supercomputers, ranging in cost from $1 million to $30 million, are
capable of performing hundreds of millions or even billions of
calculations each second. Computations requiring many hours or days
on more conventional computers may be accomplished in a few
minutes or seconds on a supercomputer.
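The distinction drawn in the footnote can be illustrated with a brief,
hypothetical sketch. The fragment below, written in modern Python with
NumPy purely for illustration (it is not drawn from the testimony, and
the array sizes are assumptions), contrasts element-by-element
computation with a whole-array operation of the kind a vector processor
issues as a single operation over many data elements.

    import numpy as np

    # Hypothetical illustration: add two arrays of one million values.
    n = 1_000_000
    a = np.random.rand(n)
    b = np.random.rand(n)

    # Conventional (scalar) style: one addition per loop iteration.
    c_scalar = np.empty(n)
    for i in range(n):
        c_scalar[i] = a[i] + b[i]

    # Vector style: the operation is expressed once over the whole array;
    # vector or parallel hardware (here, NumPy's compiled kernel) applies
    # it to many elements at a time instead of one per step.
    c_vector = a + b

    assert np.allclose(c_scalar, c_vector)

The same arithmetic is performed in both cases; the second form simply
expresses it so that many elements can be processed at once, which is
the essence of the design philosophy described above.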
The unique computational power of supercomputers makes it
possible to find solutions to critical scientific and engineering
problems that cannot be dealt with satisfactorily by theoretical,
analytical, or experimental means. Scientists and engineers in many
fields-- including aerospace, petroleum exploration, automobile
design and testing, chemistry, materials science, and electronics--
emphasize the value of supercomputers in solving complex problems.
Much of this work centers around scientific visualization, a technique
allowing researchers to plot masses of raw data in three dimensions
to create visual images of objects or systems under study. This
enables researchers to model abstract data, allowing them to "see"
and thus comprehend more readily what the data reveal.
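As a purely illustrative sketch, not taken from the GAO statement, the
fragment below shows the kind of three-dimensional plotting that
scientific visualization involves; the sampled field and the
Python/matplotlib calls are assumptions chosen only to make the idea
concrete.

    import numpy as np
    import matplotlib.pyplot as plt

    # Hypothetical raw data: a quantity sampled on a 2-D grid (a stand-in
    # for simulation or measurement output), rendered as a 3-D surface so
    # patterns can be "seen" rather than read from tables of numbers.
    x = np.linspace(-3.0, 3.0, 80)
    y = np.linspace(-3.0, 3.0, 80)
    X, Y = np.meshgrid(x, y)
    Z = np.exp(-(X**2 + Y**2)) * np.cos(2.0 * X)

    fig = plt.figure()
    ax = fig.add_subplot(projection="3d")
    ax.plot_surface(X, Y, Z, cmap="viridis")
    ax.set_xlabel("x")
    ax.set_ylabel("y")
    ax.set_zlabel("value")
    plt.show()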
While still relatively limited in use, the number of supercomputers
has risen dramatically over the last decade. In the early 1980s, most
of the 20 to 30 supercomputers in existence were operated by
government agencies for such purposes as weapons research and
weather modeling. Today about 280 supercomputers
[Footnote 2: This figure includes only high-end supercomputers such
as those manufactured by Cray Research, Inc. Including International
Business Machines (IBM) mainframes with vector facilities would
about double this number.]
are in use worldwide. Government (including defense-related
industry) remains the largest user, although private industry has
been the fastest growing user segment for the past few years and is
projected to remain so.
The industries we are examining enjoy a reputation for using
supercomputers to solve complex problems for which solutions might
otherwise be unattainable. Additionally, they represent the largest
group of supercomputer users. Over one-half of the 280
supercomputers in operation are being used for oil exploration;
aerospace modeling, testing, and development; automotive testing
and design; and chemical and pharmaceutical applications.
THE OIL INDUSTRY
The oil industry uses supercomputers to better determine the
location of oil reservoirs and to maximize the recovery of oil from
those reservoirs. Such applications have become increasingly
important because of the low probability of discovering large oil
fields in the continental United States. New oil fields are often small,
hard to find, and located in harsh environments, making exploration
and production difficult. The oil industry uses two key
supercomputer applications, seismic data processing and reservoir
simulation, to aid in oil exploration and production. These
applications have saved money and increased oil production.
Seismic data processing increases the probability of determining
where oil reservoirs are located by analyzing large volumes of
seismic data
[Footnote 3: Seismic data are gathered by using sound-recording
devices to measure the speed at which vibrations travel through the
earth.]
and producing two- and three-dimensional images of subsurface
geology. Through the study of these images, geologists can better
understand the characteristics of the area, and determine the
probability of oil being present. More accurately locating oil
reservoirs is important because the average cost of drilling a well is
estimated at about $5.5 million and can reach as high as $50 million.
Under the best of circumstances, most test wells do not result in
enough oil to make drilling cost-effective. Thus, avoiding drilling one
dry well can save millions of dollars. The industry representatives
who agreed to share cost estimates with us said that supercomputer
use in seismic data processing reduces the number of dry wells
drilled by about 10 percent, at a savings of hundreds of millions of
dollars over the last 5 years.
Reservoir simulation is used to increase the amount of oil that can be
extracted from a reservoir. Petroleum reservoirs are accumulations
of oil, water, and gas within the pores of rocks, located up to several
miles beneath the earth's surface. Reservoir modeling predicts the
flow of fluids in a reservoir so geologists can better determine how
oil should be extracted. Atlantic Richfield Company (ARCO)
representatives estimate that reservoir simulation used for the oil
field at Prudhoe Bay, Alaska--the largest in production in the United
States--has resulted in increased oil production worth billions of
dollars.
THE AEROSPACE INDUSTRY
Engineers and researchers also use supercomputers to design,
develop, and test aerospace vehicles and related equipment. In
particular, computational fluid dynamics, which is dependent upon
supercomputing, enables engineers to simulate the flow of air and
fluid around proposed design shapes and then modify designs
accordingly. The simulations performed using this application are
valuable in eliminating some of the traditional wind tunnel tests
used in evaluating the aerodynamics of airplanes. Wind tunnels are
expensive to build and maintain, require costly construction of
physical models, and cannot reliably detect certain airflow
phenomena. Supercomputer-based design has thus resulted in
significant time and cost savings, as well as better designs, for the
aerospace industry.
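By way of illustration only, and greatly simplified relative to real
computational fluid dynamics, the hypothetical Python sketch below
marches a one-dimensional diffusion equation forward on a grid; the
grid sizes and coefficients are assumptions. Full CFD codes repeat this
kind of neighbor-by-neighbor update over three-dimensional meshes with
millions of cells, which is why they demand supercomputer speeds.

    import numpy as np

    # Drastically simplified stand-in for a CFD kernel: explicit
    # finite-difference update of 1-D diffusion, du/dt = nu * d2u/dx2.
    nx, nt = 200, 500                    # grid points and time steps (toy sizes)
    nu, dx, dt = 0.1, 1.0 / 200, 1.0e-5
    u = np.zeros(nx)
    u[nx // 2 - 10 : nx // 2 + 10] = 1.0   # initial disturbance

    for _ in range(nt):
        # One sweep over the grid; each interior point is updated from its
        # neighbors, the basic pattern production codes apply in 3-D.
        u[1:-1] += nu * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])

    print("peak value after diffusion:", u.max())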
Lockheed Aerospace used computational fluid dynamics on a
supercomputer to develop a computer model of the Advanced
Tactical Fighter for the U.S. Air Force. By using this approach,
Lockheed was able to display a full-vehicle computer model of the
fighter after approximately 5 hours of supercomputer processing
time. This approach allowed Lockheed to reduce the amount of wind-
tunnel testing by 80 hours, resulting in savings of about half a
million dollars.
The Boeing Aircraft Company used a Cray 1S-2000 supercomputer to
redesign the 17-year old 737-200 aircraft in the early 1980s. Aiming
to create a more fuel-efficient plane, Boeing decided to make the
body design longer and replace the engines with larger but more
efficient models. To determine the appropriate placement of these
new engines, Boeing used the supercomputer to simulate a wind-
tunnel test. The results of this simulation--which were much more
detailed than would have been available from an actual wind-tunnel
test--allowed the engineers to solve the engine placement problem
and create a more fuel-efficient aircraft.
THE AUTOMOBILE INDUSTRY
Automobile manufacturers have been using supercomputers
increasingly since 1985 as a design tool to make cars safer, lighter,
more economical, and better built. Further, the use of
supercomputers has allowed the automobile industry to achieve
these design improvements at significant savings.
One supercomputer application receiving increasing interest is
automobile crash-simulation. To meet federally mandated crash-
worthiness requirements, the automobile industry crashes large
numbers of pre-prototype vehicles head-on at 30 miles per hour into
rigid barriers. Vehicles for such tests can cost from $225,000 to
$750,000 each. Crash simulation using supercomputers provides
more precise engineering information, however, than is typically
available from actually crashing vehicles. In addition, using
supercomputers to perform this type of structural analysis reduces
the number of actual crash tests required by 20 to 30 percent, saving
the companies millions of dollars each year. Simulations such as this
were not practical prior to the development of vector
supercomputing because of the volume and complexity of data
involved.
Automobile companies credit supercomputers with improving
automobile design in other ways as well. For example, Chrysler
Corporation engineers use linear analysis and weight optimization
software on a Cray X-MP supercomputer to improve the design of its
vehicles. The resulting designs--which, according to a Chrysler
representative, would not have been practical without a
supercomputer--will allow Chrysler to achieve an annual reduction
of about $3 million in the cost of raw materials for manufacturing its
automobiles. In addition, one automobile's body was made 10
percent more rigid (which will improve ride and handling) and 11
percent lighter (which will improve fuel efficiency). According to the
Chrysler representative, this is typical of improvements that are
being achieved through the use of its supercomputer.
THE CHEMICAL AND PHARMACEUTICAL INDUSTRIES
Supercomputers play a growing role in the chemical and
pharmaceutical industries, although their use is still in its infancy.
From computer-assisted molecular design to synthetic materials
research, companies in these fields increasingly rely on
supercomputers to study critical design parameters and more
quickly and accurately interpret and refine experimental results.
Industry representatives told us that, as a result, the use of
supercomputing will result in new discoveries that may not have
been possible otherwise.
The pharmaceutical industry is beginning to use supercomputers as a
research tool in developing new drugs. Development of a new drug
may require up to 30,000 compounds being synthesized and
screened, at a cost of about $5,000 per synthesis. As such, up to $150
million, before clinical testing and other costs, may be invested in
discovering a new drug, according to an E.I. du Pont de Nemours and
Company representative. Scientists can now eliminate some of this
testing by using simulation on a supercomputer. The supercomputer
analyzes and interprets complex data obtained from experimental
measurements. Then, using workstations, scientists can construct
three-dimensional models of the large, complex human proteins and
enzymes on the computer screen and rotate these images to gain
clues regarding biological activity and reactions to various potential
drugs.
Computer simulations are also being used in the chemical industry to
replace or enhance more traditional laboratory measurements. Du
Pont is currently working to develop replacements for
chlorofluorocarbons, compounds used as coolants for refrigerators
and air conditioners, and as cleansing agents for electronic
equipment. These compounds are generally thought to contribute to
the ozone depletion of the atmosphere and are being phased out. Du
Pont is designing a new process to produce substitute compounds in
a safe and cost-effective manner. These substitutes will be more
reactive in the atmosphere and subject to faster decomposition. Du
Pont is using a supercomputer to calculate the thermodynamic data
needed for developing the process. These calculations can be
completed by the supercomputer in a matter of days, at an
approximate cost of $2,000 to $5,000. Previously, such tests--using
experimental measurements conducted in a laboratory--would
require up to 3 months to conduct, at a cost of about $50,000. Both
the cost and time required would substantially limit the amount of
testing done.
BARRIERS TO GREATER USE OF SUPERCOMPUTERS
These examples demonstrate the significant advantages in terms of
cost savings, product improvements, and competitive opportunity
that can be realized through supercomputer use. However, such use
is still concentrated in only a few industries. Our industry contacts
identified significant, interrelated barriers that, individually or
collectively, limit more widespread use of supercomputers.
Cost. Supercomputers are expensive. A supercomputer's cost of
between $1 million and $30 million does not include the cost of
software development, maintenance, or trained staff.
Cultural resistance. Simulation on supercomputers can not only
reduce the physical testing, measurement, and experimentation, but
can provide information that cannot otherwise be attained. For many
scientists and managers this represents a dramatic break with past
training, experience, generally accepted methods, or common
doctrine. For some, such a major shift in research methodology is
difficult to accept. These new methods are simply resisted or ignored.
Lack of application software. Supercomputers can be difficult to use.
For many industry applications, reliable software has not yet been
developed. This is particularly true for massively parallel
supercomputers.
Lack of trained scientists in supercomputing. Between 1970 and
1985, university students and professors performed little of their
research on supercomputers. For 15 years, industry hired students
from universities who did not bring supercomputing skills and
attitudes into their jobs. Now, as a result, many high-level scientists,
engineers, and managers in industry have little or no knowledge of
supercomputing.
In conclusion, our work to date suggests that the use of
supercomputers has made substantial contributions in key U.S.
industries. While our statement has referred to benefits related to
cost reduction and time savings, we believe that supercomputers will
increasingly be used to gain substantive competitive advantage.
Supercomputers offer the potential--still largely untapped--to
develop new and better products more quickly. This potential is just
beginning to be explored, as are ways around the barriers that
prevent supercomputers from being more fully exploited.
EXECUTIVE OFFICE OF THE PRESIDENT
OFFICE OF SCIENCE AND TECHNOLOGY POLICY
WASHINGTON, D.C. 20506
HIGH PERFORMANCE COMPUTING AND COMMUNICATIONS
TESTIMONY
OF
D. ALLAN BROMLEY
DIRECTOR
OFFICE OF SCIENCE AND TECHNOLOGY POLICY
BEFORE THE
SUBCOMMITTEE ON SCIENCE, TECHNOLOGY, AND SPACE
COMMITTEE ON COMMERCE, SCIENCE, AND TRANSPORTATION
U.S. SENATE
MARCH 5, 1991
Mr. Chairman and members of the Committee:
Thank you for giving me the opportunity, as Director of the Office of
Science and Technology Policy, to discuss with you the critically
important issue of high performance computing and communications.
On February 4, 1991, the President announced his proposed budget
for Fiscal year 1992. Among the major new R&D programs in the
budget is a Presidential initiative on high performance computing
and communications, which is described in the report Grand
Challenges: High Performance Computing and Communications. The
report, which was released on February 5, 1991, was produced by a
Working Group on High Performance Computing and Communications
under the Committee on Physical, Mathematical, and Engineering
Sciences, which is one of seven umbrella interagency committees
under the Federal Coordinating Council for Science, Engineering, and
Technology (FCCSET). A copy of the report is attached.
The overall goals of the high performance computing and
communications initiative are symbolized by a set of what are called
"grand challenges," problems of important scientific and social value
whose solution could be advanced by applying high performance
computing techniques and resources. These include global climate
modeling, mapping the human genome, understanding the nature of
new materials, problems applicable to national security needs, and
the design of ever more sophisticated computers. Many such
problems can be addressed through high performance computing and
communications, including ones that are impossible to foresee today.
The initiative represents a full integration of component programs in
a number of Federal agencies in high performance computing and
computer communications networks. It integrates and coordinates
agency programs and builds on those programs where appropriate.
The initiative proposes to increase funding in these programs by 30
percent, from the $489 million appropriated in FY 1991 to $638
million in FY 1992.
History of the Initiative
The high performance computing and communications initiative can
trace its formative years to the early 1980s, when the scientific
community and federal agencies recognized the need for advanced
computing in a wide range of scientific disciplines. As fields of
science progressed, the quantity of data, the number of databases,
and need for more sophisticated modeling and analysis all grew. The
Lax Report of 1982 provided an opportunity to open discussions on
the need for supercomputer centers beyond those previously at the
Department of Energy's national laboratories. Subsequently, the
availability of such resources to the basic research community
expanded -- for example, through the establishment of the National
Science Foundation's and NASA's supercomputing centers.
In 1982 a FCCSET committee examined the status of supercomputing
in the United States and reviewed the role of the federal government
in the development of this technology. In 1985 this committee
recommended government action necessary to retain technological
supremacy in the development and use of supercomputers in the
United States. Subsequent planning resulted in a series of workshops
conducted in 1987 and in a set of reports that set forth a research
and development strategy.
A synthesis of the studies, reports, and planning was published by
OSTP in the report entitled The Federal High Performance Computing
Program, which was issued on September 8, 1989. The initiative in
the FY 1992 budget represents an implementation by the
participating agencies of the plan embodied in that report,
appropriately updated to recognize accomplishments made to date.
The report described a five-year program to be undertaken by four
agencies -- the Defense Advanced Research Projects Agency, the
National Science Foundation, the Department of Energy, and the
National Aeronautics and Space Administration. Four additional
partners have since joined the program -- the National Library of
Medicine within the National Institutes of Health, the Environmental
Protection Agency, and the National Institute of Standards and
Technology and National Oceanic and Atmospheric Administration
within the Department of Commerce -- and they have added
considerable strength to the overall program.
The planning and implementation of the HPCC program have been
the result of extraordinarily effective collaboration by the
participating agencies using the FCCSET forum. It was developed
after several years of discussions among the agencies and hundreds
of hours of negotiation and interaction among all federal
government agencies with an interest in computing. Agencies have
realigned and enhanced their HPCC programs, coordinated their
activities with other agencies, and shared common resources. The
final product represents a complex balance of relationships and
agreements forged among the agencies over a number of years.
These agencies have achieved a level of mutual trust, cooperation,
and synergism that is remarkable in or out of government -- and not
easily achieved. In addition, the success of this effort demonstrates
the advantages to be gained by using the FCCSET process to
coordinate areas of science and technology that cut across the
missions of several federal agencies. The FCCSET interagency process
maintains the necessary flexibility and balance of a truly integrated
program as the science and technology evolve, and it allows
additional agencies to identify opportunities and participate in a
given program.
Description of the Initiative
The HPCC initiative is a program for research and development in all
leading-edge areas of computing. The program has four major
components: (1) High Performance Computing Systems, (2)
Advanced Software Technology and Algorithms, (3) a National
Research and Education Network (NREN), and (4) Basic Research and
Human Resources. The program seeks a proper balance among the
generic goals of technology development, technology dissemination
and application, and improvements in U.S. productivity and
industrial competitiveness. It incorporates general purpose
advanced computing as well as the challenges ahead in massively
parallel computing.
In the development of computing hardware, ambitious goals have
been set. The program seeks a thousandfold improvement in useful
computing capability (to a trillion operations per second). The focus
will be on the generic technologies that will prove valuable in many
different sectors. Where appropriate, projects will be performed on a
cost-shared basis with industry.
In software development, the program will focus on the advanced
software and algorithms that in many applications have become the
determining factor for exploiting high performance computing and
communications. In particular, software must become much more
user-friendly if we are to provide a much larger fraction of the
population with access to high performance computing.
The National Research and Education Network (NREN) would
dramatically expand and enhance the capabilities of the existing
interconnected computer networks called the Internet. The overall
goal is to achieve a hundredfold increase in communications speed
(to levels of gigabits per second). In addition, the number of "on-
ramps" and "off-ramps" to the network would he greatly expanded,
bringing the potential of high performance computing to homes,
offices, classrooms, and factories. Such a network could have the
kind of catalytic effect on our society, companies, and universities
that the telephone system has had during the twentieth century. A
new meaning will be given to communication, involving not just the
transfer of knowledge but a full sharing of resources and capabilities
that no single site possesses.
Finally, the HPCC initiative will add significantly to the nation's
science and technology infrastructure through its impacts on
education and basic research. It is my personal view that the
successful implementation of this program will lay the foundation for
changes in education at all levels, including the precollege level.
Of course, no plan is better than its execution, and the execution of
the HPCC initiative will rely heavily on the synergy that has been
carefully cultivated among the participating agencies. This synergy
has been fostered by allowing each agency to do what it does best in
the way that it does best. Each of the four founding agencies has
national constituencies and historical strengths. DARPA, for example,
will lead in fostering the development of breakthrough system
technologies, as it has done in the past for time-sharing, network
operating systems, and RISC architecture. DOE, through its historical
ties with the national laboratories, has always led in the
development and use of HPCC technologies and is applying them on
the cutting-edge of scientific problems. NASA will continue to pursue
a new wave of space-related and aeronautics problems, such as
computational aerodynamics, as well as its strength in the collection,
modeling, simulating, and archiving of space-based environmental
data. And NSF's close ties with the academic community give it
special expertise both in education and in the coordination and use
of the NREN.
Expected Returns of the Initiative
The high performance computing and communications initiative
represents a major strategic investment for the nation with both
economic and social returns. I personally believe that few
technology initiatives have the potential to have a greater impact on
the ways we live and work than does the high performance
computing and communications initiative.
The high-performance end of the computer market is relatively
small, but its influence far transcends its size. The high end is where
leading-edge technologies and applications are developed. Recent
history indicates that these developments diffuse so quickly
throughout the overall market that "superminis" and
"superworkstations" are no longer contradictions in terms. A federal
investment in the leading-edge computing technology will speed the
growth of the overall computer market and may catalyze
investments on the part of U.S. industry. At the same time,
supercomputers are not the only important hardware component; we
shall not forget the importance of the smaller, more widely
distributed units and their role in the overall system.
In addition, the HPCC initiative will be a major contributor to meeting
national needs. National security, health, transportation, education,
energy, and environment concerns are all areas that have grown to
depend on high performance computing and communications in
essential ways. The dependence will grow as computers become
more powerful, cheaper, more reliable, and more usable.
HPCC is also critical for the nation's scientific infrastructure. The
electronic computer was born as a scientific tool, and its early
development was driven by scientific needs. Business applications
soon came to dominate its development, but recently there has been
a renewed focus on computers as an instrument in science. Indeed,
"computational science," which incorporates modeling, simulation and
data rendition, is adding a third dimension to experimentation and
theory as modes of scientific investigation. In field after field of
fundamental and applied sciences, problems intractable for either
theory or experimentation are being successfully attacked with the
aid of high speed computation.
Diffusion of the Initiative's Benefits
If the HPCC initiative is to realize its full potential, it is not enough
that it reach its technology goals. It is equally important that the
technologies be deployed by the private sector in a timely way to
result in an acceleration of market growth. It is likewise insufficient
for applications to be developed and problems to be solved; in
addition, the benefits accruing from those solutions must be
disseminated so as to influence our everyday lives.
The continued development and use of government-funded high
performance computing and communications prototypes can have a
significant positive impact on the potential commercialization of
these technologies. In addition, many organizations that cannot
individually justify the hardware investments will be able to gain
access to these new computing systems via the new network. Thus,
the knowledge gained through the timely development and use of
prototype systems and the access provided to them by the network
will significantly improve the dissemination of the benefits of the
initiative.
However, this wide diffusion is not possible by federal action alone.
The Administration's HPCC initiative will serve the nation best as a
catalyst for private actions. Some analysts have suggested that the
HPCC initiative can spur several hundred billion dollars of GNP
growth. If so, it will be because American companies, both large and
small, are able to deploy the technologies in producing quality goods
and services.
Similarly, some predict that NREN will lead to the establishment of a
truly national high speed network that connects essentially every
home and every office. If that happens, it will be because private
investments are stimulated by government leadership. Far from
suppressing or displacing the forces of a free market, the HPCC
initiative will strengthen them by providing the impetus for vigorous
private action.
Congressional Initiatives in High Performance Computing and
Communications
The breadth and balance of the high performance computing and
communications initiative are critical to its success. The four
components of the program are carefully balanced, and maintaining
this balance is the most important priority in the program. For
example, powerful computers without adequate software,
networking, and capable people would not result in successful
applications. A program that created only high performance
networks would not satisfy the need for greater computing
performance to take advantage of the networks and solve important
problems.
Similarly, the Administration's initiative relies on substantial
participation by industry and government laboratories to overcome
barriers to technology transfer. Cooperative government, industry,
and university activities will yield the maximum benefits derived
from moving new technologies from basic discoveries to the
marketplace.
The legislative proposals pending before the Congress, though well
intended, do not fully recognize the comprehensive interagency
effort brought about through years of collaboration. For example, S.
272 only specifies the program for two of the four major agencies
included in the high performance computing and communications
initiative. In addition, S. 272 incorrectly specifies the roles of the
agencies; many of the requirements of the legislation have, in fact,
already been accomplished; and the agencies have moved on to
further scientific and technical challenges. The legislation, in effect,
may detract from the existing programs by limiting the activities of
the agencies and by causing an unintended revision of complex
relationships forged between the agencies. For these reasons, I
strongly believe that FCCSET activities should not be codified in law.
I am concerned that legislative action not limit the flexibility of what
is by nature an extremely dynamic process. When research plans are
developed to implement interagency programs, those plans are
inevitably dynamic, just as the research efforts they describe are
dynamic and evolving. If research plans are codified in law, it
suggests that the research is static. This is particularly a concern with
high performance computing and communications, where the pace of
technological change is dramatic. As an example of a fast-moving
research opportunity, I might mention a joint Los Alamos National
Laboratory/DARPA effort that successfully applied an innovative
massively parallel Connection Machine Computer system to a nuclear
weapons safety code to gain new and valuable insights into the
safety of the nuclear weapons inventory. Another example occurred
in the last year at the National Library of Medicine's National Center
for Biotechnology Information, where researchers developed a new
fast algorithm for sequence similarity searches of protein and nucleic
acid databases. This was very helpful in the identification of a gene
causing von Recklinghausen's neurofibromatosis. This is a major
breakthrough in the understanding of this bewildering disorder that
affects about 1 in 3,000 people. On the networking front, significant
achievements have also been made. For example, the NSFNET has
increased in speed a thousandfold (from 56 kilobits per second to 45
megabits per second) since 1988.
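To illustrate the kind of technique involved, here is a minimal
sketch, in Python, of a seed-and-extend similarity search. It is not
the algorithm developed at the National Center for Biotechnology
Information; the word size, scoring scheme, and score threshold are
arbitrary values chosen only for this example, and real tools also
extend seeds to the left and use substitution matrices.

def find_similar_regions(query, subject, word_size=4, min_score=7):
    """Return (query_pos, subject_pos, length, score) tuples for regions
    of the subject sequence that locally resemble part of the query."""
    # Step 1: index every word of length word_size in the query (the seeds).
    seeds = {}
    for i in range(len(query) - word_size + 1):
        seeds.setdefault(query[i:i + word_size], []).append(i)
    hits = []
    # Step 2: scan the subject; wherever a word matches a seed, extend the
    # match to the right (+1 per match, -1 per mismatch) while the running
    # score stays positive, remembering the best score reached.
    for j in range(len(subject) - word_size + 1):
        for i in seeds.get(subject[j:j + word_size], []):
            score = best = length = best_length = word_size
            qi, sj = i + word_size, j + word_size
            while qi < len(query) and sj < len(subject) and score > 0:
                score += 1 if query[qi] == subject[sj] else -1
                qi, sj, length = qi + 1, sj + 1, length + 1
                if score > best:
                    best, best_length = score, length
            if best >= min_score:
                hits.append((i, j, best_length, best))
    return hits

# Example: locate an approximate copy of the query inside a longer sequence.
print(find_similar_regions("GATTACA", "TTGATTACATT"))   # [(0, 2, 7, 7)]
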
S. 272 has as its focal point the issuing of a plan that would delineate
agency roles and include specific tasks. However, the
Administration's initiative and the accompanying FCCSET report
satisfy these demands for items to be incorporated in the planning
phase. S. 272 further calls for the establishment of an advisory
panel to provide additional input into the plan. But many of the
agencies already have advisory panels, and private sector
participation is fully anticipated in the Administration's initiative as
agency programs move forward to implementation. Moreover, the
oversight role of the Congress, including the hearings scheduled this
week in the House and Senate, serves as an important element in the
fine tuning of the program.
The National Research and Education Network described in the
initiative addresses the need for greatly enhanced computer
communications highlighted in the legislation. The initiative also
seeks to be comprehensive in addressing the roles of the various R&D
agencies -- for example, by allowing other agencies to join the effort
as appropriate.
It bears emphasis that the Administration's initiative uses the
existing statutory, programmatic, budgetary, and authorizing
authorities of the agencies and departments involved in the
initiative, including OSTP. The funding levels necessary to proceed
with this effort have been transmitted to the Congress in the
President's request and are clearly reflected in the budgets of each of
the eight agencies involved in the initiative. The Congress already
has the ability to positively affect the high performance computing
program of the federal government through existing authorizations
and appropriations.
FCCSET is a very important mechanism within the Executive Branch
for reviewing and coordinating research and development activities
that cut across the missions of more than one federal agency. Unlike
the committees in the Legislative Branch, each of which has discrete
authority for oversight, interagency committees within FCCSET are
forums for discussion, analysis, collaboration, and consensus building.
The member agencies then have the responsibility for implementing
the program and proceeding with the necessary contracting,
budgeting, and so on, as developed through the interagency process.
Several legislative vehicles, in addition to S. 272, have been
introduced that seek to endorse and advance the Administration's
initiative. I welcome the Congress's interest and intentions in high
performance computing and communications. I am confident that by
working together we can have a significant impact on the nation's
future through these efforts, and I welcome suggestions from
Congress to improve the current initiative.
I might suggest that hearings to receive the views of all the various
communities involved with this proposal and a positive endorsement
of this program by Congress would be of great assistance in
advancing high performance computing and communications in this
country. Positive action on the requested appropriations will ensure
that this extensive interagency program can go forward.
Mr. Chairman and members of the committee, let me conclude by
saying that I look forward to working cooperatively with you on this
initiative. We share the same goals, and I am confident that we can
reach a consensus on how best to achieve them.
CONVEX COMPUTER CORPORATION
WRITTEN STATEMENT
Presented to
U.S. Senate
Commerce, Science and Transportation
Subcommittee on Science, Technology and Space
CONVEX supports S. 272, the High-Performance Computing Act of
1991, as we believe it will assist U.S. industry in maintaining
leadership in computing technology. We strongly believe this
legislation can help counter one of the biggest threats
facing the United States today: the loss of our international
competitiveness in all technology-related businesses. In addition, it
will directly stimulate the supercomputing industry.
Europe and Japan have targeted information technologies for
particular attention, and unless decisive steps are taken to ensure
our continued leadership, the U.S. could be surpassed in a technology
field that we largely pioneered and which is vital to our economic
future.
The real American competitiveness question involves making our
nation's industries competitive. The use of supercomputers is
mandatory for maintaining America's competitive edge in all of our
key industries, such as aerospace, automotive, electronics,
pharmaceuticals, petroleum, etc. -- not just in supercomputing
manufacturing.
We believe the actions called for in S. 272 -- particularly the
acceleration of the development of computer systems and
subsystems, the stimulation of research on software technology, and
the application of high-performance computing to "Grand Challenges"
-- are not only appropriate goals, but vital to maintaining the U.S.
lead in supercomputers and utilizing supercomputer technology in
our high-tech industries and research.
Supercomputers are the fundamental building blocks that contribute
to almost all disciplines across the broadest spectrum of science and
technology. In the 1990's, the way America can stay competitive is
literally to put supercomputing in the hands of the "masses."
Supercomputers are to the modern technologist what the invention
of the microscope was to biologists and the telescope was to
astronomers. In fact, supercomputers enable scientists and
engineers to solve problems involving things that are too small, too large,
too quick, too slow, or too dangerous to observe directly. This use in
industry results in new products that are more innovative, safer, and
get to market more quickly. Their use in research results in
fundamental breakthroughs in science that change how we see the
world. The supercomputer is the one common tool across all U.S.
scientific and technological activities that, if put in the hands of
engineers and scientists throughout the United States, can
dramatically sharpen the competitive output of the United States.
Of course, Japanese industry and research institutions totally
understand and believe these concepts. From our perspective, Japan
has been the fastest nation to purchase CONVEX's latest technology.
Until just recently, there were more of CONVEX's top-of-the-line
supercomputers in Japan than in the United States. American
researchers and engineers believe these concepts also, but access to
supercomputer tools has been limited. S. 272 can be the catalyst to
change this trend.
CONVEX's assessment of the competitive position of the high-
performance computer industry in the U.S. relative to that of Japan is
as follows:
The high-performance computer market is an international market
in which Cray dominates the high-end of the market, and CONVEX
dominates the mid-range market. The Japanese computer
manufacturers, NEC, Fujitsu, and Hitachi, have high performance, fast
hardware products. But while this is the case, U.S. high performance
computer companies currently maintain the lead in supercomputing
for the following reason: supercomputing is not about hardware; it is
about solving complex problems. The U.S. supercomputer companies
are ahead of foreign competition because we understand that there are
four aspects to supercomputing solutions:
o Balanced, high-performance hardware: There is more to real
performance than pure megaflops or gigaflops ratings.
Unfortunately, that is how performance is commonly measured, but
these figures must be properly interpreted. There is much more
to useful performance than peak speed, such as software
performance, memory performance, and I/O performance. Users
care only about the performance of their applications -- the
problems they specifically solve with their machines -- and this type
of performance is determined by dozens of attributes (a simple
numerical sketch follows this list). In terms of
speed, the Japanese have high peak performance, but that is only a
part of the supercomputing solution.
o Software technology -- Operating systems (UNIX) and
compilers: Maintaining the lead requires being proficient at
several software standards. Companies such as CONVEX and Cray
recognized the emergence of the UNIX standard long ago and
designed their machines for UNIX -- now considered a requirement
in supercomputing. Japanese systems have historically been based on
IBM standards and are only now attempting to migrate to UNIX. In
addition, superior compiler technology is critical to computing
performance and productivity. American companies and research
institutions lead in this area as well.
o Application specific software: Most of the supercomputers
in use today, especially in industry, utilize third-party written
software applications rather than custom-written software
applications. The majority of that third-party software is developed
by U.S.-based organizations. CONVEX considers having a broad
array of application software available on its machines, and having
agreements and relationships with the software developers, to be
critical elements of its competitive strategy and success. American suppliers
are leading in this crucial area.
o Service and support -- taking care of the customer: This is a
critical component in supercomputing solutions. American companies'
reputations in the area of service and support are superior.
American suppliers utilize direct sales and support organizations in
all major markets and, as such, are closer to the customer. Outside of
Japan, Japanese manufacturers typically use distributors or OEMs for
sales and customer support.
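The point about balanced hardware can be made concrete with a
back-of-the-envelope sketch in Python. The machine figures below are
hypothetical, not those of any actual product: on a memory-bound
application, the machine whose memory system keeps pace with its
arithmetic units delivers more useful performance than the machine
with the higher peak rating.

def attainable_mflops(peak_mflops, memory_mbytes_per_sec, flops_per_byte):
    # Delivered performance is capped by whichever is smaller: the peak
    # arithmetic rate or the rate at which memory can feed the arithmetic units.
    return min(peak_mflops, memory_mbytes_per_sec * flops_per_byte)

# Two hypothetical machines: A has a high peak but a modest memory system;
# B has a lower peak but a better-balanced memory system.
machines = [("Machine A", 2000.0, 500.0), ("Machine B", 800.0, 1600.0)]
for name, peak, bandwidth in machines:
    # An application performing one floating-point operation per byte moved.
    print(name, attainable_mflops(peak, bandwidth, 1.0), "MFLOPS delivered")
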
It would be naive to believe that U.S. companies will always be able
to maintain the supercomputer lead for the reasons cited above
without continual development and diligence in these areas. The
Japanese can -- and will, in time -- develop these necessary
strengths. Although CONVEX has been selling its supercomputers
successfully to the Japanese for almost six years now, we also realize
that when, or if, the Japanese companies decide that the
price/performance market niche that CONVEX currently dominates is
a viable and sizable market for Japan, the competitiveness threat
posed by Japan can become very serious.
The biggest threats posed by the Japanese to American
supercomputer companies are:
o The combined size of the big three Japanese companies is over $89
billion, which provides substantial financial staying power. This
gives them the ability to mask the success or lack of success of their
supercomputer products versus U.S. supercomputer companies,
whose existence relies solely on the success of their supercomputers.
o Furthermore, they can afford to not be profitable in the
supercomputer market segment for a very long period of time and
can buy market share by excessive and unreasonable discounting,
while public U.S. companies are forced to live by quarter-to-quarter
reporting, which reflects the results of a single technology focus.
o The big three Japanese computer companies also dominate the
semiconductor industry, including advanced semiconductor research
and development required to build supercomputers.
o The cost of capital differs substantially for U.S. versus
Japanese companies.
In light of these factors, staying competitive in today's global
supercomputer market will take a concerted effort by American
companies, as well as cooperation and constructive stimulation by
government. Certainly, the High-Performance Computing Act of
1991 will be a positive contribution in this direction.
Comments on the bill, S. 272
General Comments
CONVEX enthusiastically supports this legislation and commends it to
you for your favorable consideration and swift passage in the House.
We fully support the idea of a "National High-Performance
Computing Program." There are several provisions of the bill on
which I would like to comment and highlight.
The High-Performance Computing Advisory Panel
The federal government has played a prominent role in the
American supercomputing success story and S. 272 again
demonstrates this leadership. In several areas of the bill,
cooperation between government and industry is called for to review
progress made in implementing the plan and making necessary
revisions. In particular, the bill calls for the establishment of a High-
Performance Computing Advisory Panel consisting of representatives
from industry and academia to assist with these tasks. I want to
highlight this concept as being extremely important to achieving the
objectives of the bill. The results of the expenditures for equipment
and research called for by the bill must ultimately be the
development of competitively superior commercial products. The
strategic plan that is put into place by this bill should have this as a
fundamental objective. Government is better qualified for some
aspects of the task, and industry is better qualified for others.
Partnership between the two will allow the plan to utilize the best
capabilities of both. CONVEX has exposure to applications, research
and product developments occurring all over the world, and in the
broadest of scientific areas. We volunteer to help in whatever ways
we can.
The National Research and Education Network (NREN)
CONVEX fully supports the bill's provision calling for the creation of a
multi-gigabit-per-second National Research and Education Network
(NREN). It is our perspective that in the past, too much emphasis
was placed on providing limited access to too few centralized
machines. Supercomputing must be made available to, and meet the
needs of, a broad base of users through widely distributed
supercomputer systems placed closer to the ultimate user. This
would not supplant the centralized machines, but rather complement
them.
I suggest that in establishing NREN, it should not only be envisioned
as a multi-gigabit per second backbone network, connecting only a
small number of very high-speed, centralized computer systems.
Let's think of it as a distributed network of computing and
telecommunications services, serving the widest possible number of
scientists and engineers from government, industry, and academia.
The National Science Foundation's national supercomputer centers
represent a case in point. The program has been a success, but we
can learn from what those users are additionally asking for:
supercomputing close to the user. Let's supplement and complement
the national supercomputer centers with affordable, open, accessible
supercomputing facilities, available in departments and dedicated to
products across the nation. Let's put a broad range of
supercomputers, distributed databases, and other research and
production facilities in the very laps of those who need them to help
maintain and regain America's preeminence in many disciplines.
Software
In the last ten years, only about 300 high-end supercomputers have
been sold by U.S. companies to industry and to research institutions.
From CONVEX alone, over 600 high-performance computing systems
have been shipped in only five years. American industry needs
distributed, affordable supercomputing power to remain competitive.
These companies, large and small, are voting with their checkbooks
for this means of providing supercomputing. They are using
supercomputing in production environments, not just in their
research laboratories. They need supercomputers to bring new and
improved products to market faster. Supercomputers are a valued
competitive weapon for all of these companies.
The full utility of supercomputers can only be reached through
software. The sophisticated supercomputing user community
desperately needs improved software development tools, computer-
assisted software engineering (CASE) capabilities, and better
algorithmic methods. With this improved state-of-the-art software,
we can move forward with attacks on the Grand Challenges
enumerated in the bill.
CONVEX wholeheartedly supports the software tasks and goals of the
bill. Care should be taken to ensure that resources are not wasted by
reinventing what may already exist in industry or somewhere in the
world. Let's concentrate on improving software technology, while
adhering to industry standards wherever possible and avoiding the
proliferation of proprietary solutions to software problems.
Basic Research and Education
CONVEX strongly supports the provisions of the bill in the areas of
basic research and education. Only the largest and richest
corporations can afford to have very much of their resources
dedicated to basic research. Most of the industry, and I count
CONVEX in this group, must use its limited research and development
resources in the development and production of the next generations
of our commercial products. So we need a fertile source of basic
research if the supercomputer industry and the nation are to
progress.
Again, this must be treated as a partnership. We must create
effective, efficient, fast-acting technology transfer mechanisms so
that our basic research can be fully utilized. We, therefore,
recommend that the bill specifically call for the creation of a
separate, responsible Technology Transfer Program Office to ensure
that basic research is translated into products to be used to further
all of our goals.
In the area of education, the United States needs a great deal of
assistance to help us remain competitive. The bill's provisions to
educate and train additional undergraduate and graduate students in
software engineering, computer science, and computational science
and to provide researchers, educators, and students with access to
high-performance computing are extremely worthwhile. However,
the intent of the bill should be applied across the board in the
supercomputing industry and should include mechanical engineers,
packaging engineers, chemical engineers and others.
Summary
In summary, I recommend this bill to you. The amount of funding
called for by this bill is indeed small when compared to the
significant economic benefit the program will bring to U.S. industrial
competitiveness. It is essential that the United States remain
aggressive in the area of supercomputer technology. This bill will
combine the resources of U.S. industry, government, and universities
to meet the challenge of foreign competition.
Testimony by
DR. JOHN PATRICK CRECINE
PRESIDENT, GEORGIA INSTITUTE OF TECHNOLOGY
for a Hearing of
THE SENATE COMMITTEE ON COMMERCE, SCIENCE AND
TRANSPORTATION
March 5, 1991
Mr. Chairman, it is an honor to be asked to testify at this joint
hearing on S. 272, The High Performance Computing Act of 1991.
I am John P. Crecine, President of the Georgia Institute of Technology.
Georgia Tech is a major technological university, with an enrollment
of approximately 12,000 students, located in Atlanta, Georgia.
Georgia Tech is one of the nation's leading research universities,
having conducted over $175 million in sponsored research during the
past year, almost all in the areas of science, engineering and
technology.
I would like to thank this committee, and especially Senator Gore, for
their continued strong support of computing-related research. I
think the committee's focus on computing in the context of national
competitiveness is an appropriate one, and one that anticipates
critical technologies. Georgia Tech strongly supports
S. 272, and eagerly awaits possible participation in translating its
objectives into reality.
Georgia Tech, as a major technological university, has placed a high
priority on computing and related facilities. This may be best
demonstrated by the creation in 1989 of the College of Computing,
the nation's first college devoted entirely to computing. Both within
the College of Computing, and throughout the rest of the Institute,
there is a deep and comprehensive involvement with leading-edge
computational science and engineering. For this reason, the activities
proposed under the High Performance Computing Initiative are
eagerly awaited.
The special importance of creating a high-performance computing
network like NREN is its impact not only on computing research
itself, but its creation of a basic "digital infrastructure" for the nation.
Communications, both simple -- like a phone dial tone -- and
complicated -- like HDS -- will be dependent on digital networks.
Communications make it possible for the first time to conduct
research and advance scientific frontiers from afar, combining the
parts of experimental setups from around the country instead of
expensively reproducing them in many locations. Equally important
to utilizing this network capability are the complementary parts of
the high performance computing initiative. Thus, the technology of a
digital network like NREN lies at the heart of most future research
efforts in science and engineering.
Specifically, the impact of this legislation on technologically-oriented
educational institutions like Georgia Tech will be multidimensional. I
would like to focus my remarks today on three areas: engineering
education, computer science, and technological applications.
Engineering, and engineering education, is Georgia Tech's "core
business," and stands to benefit greatly from this initiative in high
performance computing. As the role of computing has grown, up-to-
date computing facilities are no longer a luxury, but a necessary,
integral part in engineering education and research. For example, at
the graduate level, we must have the computational facilities that
will enable us to train our students in computer-based science and
engineering techniques, skills industry expects our students to have.
The connectivity in the network already allows our students to use
remote facilities such as telescopes and high-energy research
facilities without the cost and capacity constraints inherent in those
sites. However, an initiative such as this expands exponentially the
opportunities available to them. What NREN does is shift the focus
from physically possessing a high-powered and expensive
computational device such as a supercomputer to having access to one
of these devices. In the end, this makes for a much more productive
and cost-effective environment for creating and disseminating
knowledge.
The new capabilities given us by the high performance computing
initiative have impressive spin-off effects as well. As more students,
professors and researchers gain access to advanced computing, I
predict we will see an impressive array of offshoot, but related,
architectures and systems that will take full advantage of the
capabilities of this network. Once again, this is an issue of national
competitiveness, an area where this initiative gives our universities
and research laboratories the tools with which to compete.
Just as engineering has been traditionally important to Georgia Tech,
we are taking a leadership position in computing with the creation of
our College of Computing. This College of Computing, while not
representing the entire spectrum of computing at Georgia Tech, was
created as a top-level organization to emphasize computing, and
speed the integration of computer science and other disciplines. In
many respects, this organization parallels the objectives of this high
performance computing initiative and NREN. Simply put, high
performance computing is a top priority, one in which we have
invested and on which we have focused, and a natural area of
concentration for a university like Georgia Tech.
I see a very positive dual flow between the high performance
initiative and our computer science operations. First, many of the
areas we are focusing on, specifically management of large scientific
databases and distributed operating systems for highly parallel
machines, are topics important to the success of the HPC initiative,
and we hope to be able to contribute our expertise in these areas
toward making the initiative a success. We are also forming a
Visualization, Graphics and Usability (VGU) lab under prominent
national leadership to develop better techniques for visualizing
scientific data, a critical component of this proposed network. But
we also envision that the project will benefit computing at Georgia
Tech by adding to our own knowledge and expertise, and should aid
not only Georgia Tech but many other universities nationwide.
The HPCI will have a major positive effect on many areas of basic
computer science research, even in ways that are not directly related
to high performance computing. For example, the visualization
advances I just talked about have applicability to low-performance
computing, and work in user interfaces for all types of computers
could be aided by work done through the high performance project.
The third area where I feel the High Performance Computing Act of
1991 will have a critical impact is in the development of new
technological applications. Georgia Tech is not an "ivory tower" - we
solve some very applied problems, and focus on transferring the
technology developed in our laboratories to the marketplace.
I believe we are on the threshold of a revolution in
telecommunications, a merging of the traditional telecommunications
industry with the computer and broadcast industries, with the
common denominator of a digital network tying them all together.
This act develops such a network (and the functions that
support and depend on the network), propelling universities into an
integrated communications environment that is a natural test bed for
future communications systems. Other countries have been
furthering this concept, but development in the United States has
been hampered by the regulatory environment and hurdles imposed
by previous paradigms. In this vision, we should view NREN not so
much as a way to link scholars or transfer data, but as an
experimental tool in itself. The network is then a test of its own
capabilities, that is, a test of the capabilities of a digital network, its
speed, volume, and capacity for accommodating different signals. Its
success impacts not only the educational community, but
demonstrates this new model for telecommunications and firmly
establishes a United States lead in these technologies.
In the end, the issue becomes one of educational competitiveness.
Without the resources, opportunities and challenges network-based
computing opens up for our engineers, we would quickly be non-
competitive not only nationally, but internationally. This initiative
lays important groundwork for the U.S. to regain the initiative in
high-performance computing and to increase our edge in network
technologies.
In closing, I would like to especially express my support for the
administration's multi-year approach to this project. If we are to
undertake a project of this magnitude, a five-year commitment on
the part of the government makes it much easier and more efficient
to both plan for and attract talent to this project. Georgia Tech is
especially supportive of the roles of NSF, NASA and DARPA in
administering this project. Given their prior leadership and track
record in running projects of this scope, it makes eminent good sense
for this triad to lead an initiative as significant as this one.
This is a remarkable opportunity, and I, as President of Georgia Tech,
stand ready, as do many of my colleagues in universities around the
country, to assist in any way possible to make this vision a reality.
STATEMENT OF SENATOR AL GORE
TUESDAY, MARCH 5 HEARING ON S. 272,
THE HIGH-PERFORMANCE COMPUTING ACT OF 1991
Today, the Science Subcommittee is considering S. 272, the High-
Performance Computing Act. This bill will ensure that the United
States stays at the leading edge in computer technology. It would
roughly double the Federal government's investment in research and
development on new supercomputers, more advanced software, and
high-speed computer networks. Most importantly, it would create a
National Research and Education Network, the NREN, which would
connect more than one million people at more than a thousand
colleges, universities, laboratories, and hospitals throughout the
country, giving them access to computing power and information
resources unavailable anywhere today.
These technologies and this network represent our economic
future. They are the smokestack industries of today's Information
Age. We talk a lot now about jobs and economic development; about
pulling our country out of recession and into renewal. Our ability to
meet the economic challenges of the Information Age and beyond --
tough challenges from real competitors around the globe -- will rest
in large measure on our ability to maintain and strengthen an
already threatened lead in these technologies and industries.
I have been advocating legislation such as this for more than one
dozen years because I strongly believe that it is critical for our
country to develop the best scientists, the best science, the fastest,
most powerful computers, and then, to ensure access to these
technologies to as many people as possible so as many people as
possible will benefit from them. This legislation will help us do that.
Every year, there are new advocates. This year, finally, President
Bush is among them, including, in his budget for Fiscal Year 1992, $149
million in new funding to support these technologies.
We cannot afford to wait or to put off this challenge. Not if we
care about jobs, economic development, or our ability to hold our
own in world markets.
During the last thirty years, computer technology has improved
exponentially, faster than technology in any other field. Computers
just keep getting faster, more powerful, and less expensive.
According to one expert, if automobile technology had improved as
much as computer technology has in recent years, a 1991 Cadillac
would now cruise at 20,000 miles per hour, get 5,000 miles to a
gallon, and cost only three cents!
As a result of these amazing advances, computers have gone
from being expensive, esoteric research tools isolated in the
laboratory to an integral part of our everyday life. We rely on
computers at the supermarket, at the bank, in the office, and in our
schools. They make our life easier in hundreds of ways.
Yet the computer revolution is not over. In fact, according to
some measures, the price-performance ratio of computers is
improving even faster now than it has in the past.
Anyone who has seen a supercomputer in action has a sense of
what computers could do in the future. Today, scientists and
engineers are using supercomputers to design better airplanes,
understand global warming, find oil fields, and discover safer, more
effective drugs. In many cases they can use these machines to mimic
experiments that would be too expensive or downright impossible in
real life. With a supercomputer model, engineers at Ford can
simulate auto crashes and test new safety features for a fraction of
the cost and in a fraction of the time it would take to really crash an
automobile. And they can observe many more variables, in much
more detail, than they could with a real test.
The bill we are considering today is very similar to the first title
of S. 1067, the High-Performance Computing Act of 1990, which
passed the Senate unanimously last October. Unfortunately, the
House was unable to act on the bill before we adjourned.
It is my hope that we will be able to move this bill quickly this
year. There is widespread support in both the House and the Senate.
In the House, Congressman George Brown, the new chairman of the
House Committee on Science, Space, and Technology, has introduced a
very similar bill, H.R. 656, which has been cosponsored by
Congressmen Tim Valentine, Sherwood Boehlert, Norm Mineta, and
others. On Thursday, the Science Committee's Subcommittee on
Science and its Subcommittee on Technology and Competitiveness
will be holding a hearing on the bill. I look forward to working with
my House colleagues to move this bill as quickly as possible.
This legislation provides for a multi-agency high-performance
computing research and development program to be coordinated by
the White House Office of Science and Technology Policy (OSTP),
whose director, Dr. D. Allan Bromley, is our first witness today. The
primary agencies involved are the National Science Foundation (NSF),
the Defense Advanced Research Projects Agency (DARPA), the
National Aeronautics and Space Administration (NASA), and the
Department of Energy (DOE). Each of these agencies has experience
in developing and using high-performance computing technology.
S. 272 will provide for a well-planned, well-coordinated
research program which will effectively utilize the talents and
resources available throughout the Federal research agencies. In
addition to NSF, NASA, DOE, and DARPA, this program will involve
the Department of Commerce (in particular the National Institute of
Standards and Technology and NOAA), the Department of Health and
Human Services, the Department of Education, the United States
Geological Survey, the Department of Agriculture, the Environmental
Protection Agency, and the Library of Congress, as well. The
technology developed under this program will find application
throughout the Federal government and throughout the country.
S. 272 will roughly double funding for high-performance
computing at NSF and NASA during the next five years. Additional
funding -- more than $1 billion during the next five years -- will also
be needed to expand research and development programs at DARPA
and DOE. Last year, I worked closely with Senators Johnston and
Domenici on the Energy Committee to pass legislation to authorize a
DOE High-Performance Computing Program, and I hope to work with
them and the other members of the Energy Committee to see that
program authorized and funded in fiscal year 1992. Already, Senator
Johnston and others have introduced S. 343, which would authorize
DOE's part of this multi-agency program.
To fund DOD's part of the program, last year I worked with
Senators Nunn and Bingaman and others on the Armed Services
Committee to authorize and appropriate an additional $20 million for
DARPA's high-performance computing program, money that has been
put to good use developing more powerful supercomputers and
faster computer networks. Advanced computer technology was a
key ingredient of the allies' success in the Persian Gulf War, but we
cannot simply rely on existing technology; we must make the
investment needed to stay at the leading edge. It is important to
remember that the Patriot missile and the Tomahawk cruise missile rely
on computers based on technologies developed through Federal
computer research programs in the 1970's. The High-Performance
Computing Act will help ensure the technological lead in weaponry
that helped us win the war with Iraq and that will improve our
national security in the future.
This same technology is improving our economic security by
helping American scientists and engineers develop new products and
processes to keep the U.S. competitive in world markets.
Supercomputers can dramatically reduce the time it takes to design
and test a new product -- whether it is an airplane, a new drug, or an
aluminum can. More computing power means more energy-efficient,
cheaper products in all sectors of manufacturing. And that means
higher profits and more jobs for Americans.
Perhaps the most important contribution this bill will make to
our economic security is the National Research and Education
Network, the cornerstone of the program funded by this bill. By
1996, this fiber-optic computer network would connect more than
one million people at more than one thousand colleges and
universities in all fifty states, allowing them to send electronic mail,
share data, access supercomputers, use research facilities such as
radio telescopes, and log on to data bases containing trillions of bytes
of information on all sorts of topics. This network will speed
research and accelerate technology transfer, so that the discoveries
made in our university laboratories can be quickly and effectively
turned into profits for American companies.
Today, the National Science Foundation runs NSFNET, which
allows researchers and educators to exchange up to 1.5 million bits
(1.5 megabits) of data per second. The NREN will be at least a thousand
times faster, allowing researchers to transmit all the information in
the entire Encyclopaedia Britannica from coast to coast in seconds.
With today's networks, it is easy to send documents and data, but
images and pictures require much faster speeds. They require the
NREN, which can carry gigabits, billions of bits, every second.
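A rough calculation makes the difference in scale concrete. The
encyclopedia size used below is an assumption for illustration only,
taken as roughly one gigabyte of text.

# Compare transfer times for roughly a gigabyte of text (an assumed size)
# at NSFNET's current rate and at a gigabit-per-second NREN rate.
encyclopedia_bits = 1_000_000_000 * 8

for name, bits_per_second in [("NSFNET at 1.5 megabits/second", 1.5e6),
                              ("NREN at 1 gigabit/second", 1.0e9)]:
    seconds = encyclopedia_bits / bits_per_second
    print(f"{name}: about {seconds:,.0f} seconds")

At 1.5 megabits per second the transfer takes roughly an hour and a
half; at a gigabit per second it takes about eight seconds.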
With access to computer graphics, researchers throughout the
country will be able to work together far more effectively than they
can today. It will be much easier for teams of researchers at colleges
throughout the country to work together. They will be able to see
the results of their experiments as the data comes in, they will be
able to share the results of their computer models in real-time, and
they will be able to brainstorm by teleconference. William Wulf,
formerly Assistant Director for Computer and Information Science
and Engineering at NSF, likes to talk about the "National
Collaboratory" -- a laboratory without walls which the NREN will
make possible. Researchers throughout the country, at colleges and
labs, large and small, will be able to stay on top of the latest
advances in their fields.
The NREN and the other technology funded by S. 272 will also
provide enormous benefits to American education, at all levels. By
most accounts, we are facing a critical shortage of scientific and
technical talent in the next ten years. By connecting high schools to
the NREN, students will be able to share ideas with other high school
students and with college students and professors throughout the
country. Already, some high school students are using the NSFNET to
access supercomputers, to send electronic mail, and to get data and
information that just is not available at their schools. In this way,
the network can nurture and inspire the next generation of scientists.
Today, most students using computer networks are studying
science and engineering, but there are more and more applications in
other fields, too. Economists, historians, and literature majors are all
discovering the power of networking. In the future, I think we will
see computers and networks used to teach every subject from
kindergarten through grad school. I was recently at MIT, where I
was briefed on Project Athena, a project to integrate computers and
networks into almost every course at MIT. Students use computers
to play with the laws of physics in computer models, to test airplane
designs in wind tunnel simulations, to improve their writing skills,
and to learn foreign languages. Many of the ideas being developed at
Project Athena and in hundreds of other experiments elsewhere
could one day help students and teachers throughout the country.
The library community has been at the forefront in using
computer and networking technology in education. For years, they
have had electronic card catalogues which allow students to track
down books in seconds. Now they are developing electronic text
systems which will store books in electronic form. When coupled to
a national network like the NREN, such a "Digital Library" could be
used by students and educators throughout the country, in
underfunded urban schools and in isolated rural school districts,
where good libraries are few and far between.
I recently spoke to the American Library Association annual
meeting in Chicago and heard many librarians describe how the
NREN could transform their lives. They are excited about the new
opportunities made possible by this technology.
The technology developed for the NREN will pave the way for
high-speed networks to our homes. It will give each and every one of
us access to oceans of electronic information, let us use
teleconferencing to talk face-to-face to anyone anywhere, and
deliver advanced, digital programming even more sophisticated and
stunning than the HDTV available today. Other countries, Japan,
Germany, and others, are spending billions to install optical fiber to
the home, to take full advantage of this technology.
With this bill we can help shape the future -- shape it for the
better. This is an investment in our national security and our
economic security which we cannot afford not to make. For that
reason I was very glad to see the Administration propose a High-
Performance Computing and Communications Initiative, a program
very similar to the program outlined in S. 272. I intend to work
closely with Dr. Bromley and others within the Administration as
well as my colleagues in Congress to secure the funding needed to
implement this critically important program.
I look forward to hearing the testimony of Dr. Bromley and all of
the distinguished witnesses who have made time in their very busy
schedules to be here today. And I look forward to working with my
colleagues on the Commerce Committee towards passage of this bill.
Statement of Mr. Tracey Gray
Vice President of Marketing
Government Systems Division
US Sprint Communications Company Limited Partnership
Before the Subcommittee on Science, Technology, and Space
of the Committee on Commerce, Science, and Transportation
United States Senate
Room 252, Russell Senate Office Building
March 5, 1991 2:00 p.m.
Hearings before the Senate
Subcommittee on Science, Technology, and Space
of the
Committee on Commerce, Science, and Transportation
on
S.272, The High Performance Computing Act of 1991
Washington, D.C.
March 5, 1991
Prepared Statement of
Mr. Tracey Gray
Vice President of Marketing for the Government Systems Division
US Sprint Communications Company Limited Partnership
INTRODUCTION
Thank you, Mr. Chairman and members of the Subcommittee. I am
Tracey Gray, Vice President of Marketing for the Government
Systems Division at US Sprint. I appreciate this opportunity to speak
with you on S.272, the High-Performance Computing Act of 1991.
As you know, US Sprint is the third largest telecommunications
carrier in the United States today - and the only all-fiber, fully
digital network. US Sprint serves 90% of the Fortune 500 U.S.
companies with voice, data, and video services, and we offer
telecommunications services to 153 countries around the world.
My division, the Government Systems Division, is proud to serve over
500,000 government employees at 35 agencies under the FTS 2000
contract. In addition to FTS 2000, we are responsible for all business
relations and opportunities with the federal government. This
includes evaluating and assessing the risks and opportunities with
emerging technologies and applications in telecommunication
network solutions.
NREN APPLICATIONS
I would like to talk with you today about NREN, the National
Research and Education Network -- one component of the High
Performance Computing initiative. Mr. Chairman, the operative word
in that sentence is Network. High performance networking should
share equal billing with high performance computing.
US Sprint does not build supercomputers; we do not maintain or
operate an information infrastructure of databases; we do not
develop computer software tools or train supercomputer hardware
or software engineers. US Sprint does provide telecommunications
services -- based on state-of-the-art, fiber technology and advanced
network architectures. Fiber technology will be the network
infrastructure that supports the computing hardware necessary to
solve the Grand Challenges. This future network platform will allow
researchers to establish National Collaboratories among our nation's
laboratories and university research centers that will solve the Grand
Challenge problems such as global warming, the identification of new
superconducting materials, and the mysteries of cancer-causing
genes.
While the Grand Challenge problems certainly require our attention,
US Sprint appreciates the Committee's understanding that industry
related problems exist that can benefit from the application of high
performance computing. This Committee's 1990 report on S.1067
rightly noted that a supercomputer model helped Boeing design a
737 airplane that was 30% more efficient. The petroleum industry
benefited when Arco used a Cray supercomputer to increase oil
production at its Prudhoe Bay field, resulting in a two billion dollar
profit for the company. An Alcoa supercomputer model reduced the
amount of aluminum needed for its soda cans by 10%, resulting in
transportation and production savings. Mr. Gore, your January 24
statement noted that Ford's engineers can simulate automobile crash
tests using supercomputers for a fraction of the cost of conducting
real life experiments. Each of these industry applications of
supercomputing benefits the American consumer and the national
interest through greater efficiencies, higher quality products,
increased cost savings, and improved productivity.
But let's not focus solely on supercomputers and connecting
supercomputers. Other research and engineering applications require
high speed networking, and by bringing other applications on to this
network, we can increase scale economies that could justify
investments in multi-gigabit networks.
For example, medical doctors are confronting a problem where
technology produces greater diagnostic capability, yet there are
fewer experts to interpret the data. The solution is teleradiology --
the process of digitizing and transmitting medical images to distant
locations - which allows the nation's top radiologists to access key
medical imaging from virtually anywhere in the United States in
seconds. Today, US Sprint's network can transmit diagnostic quality
images in approximately 37 seconds using multiple 56 kilobit per
second lines. The same image would take up to an hour and a half to
transmit over a traditional analog network using 9600 bits per
second.
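The arithmetic behind these figures can be sketched as follows. This is
a purely illustrative calculation, written here as a short Python
script; the 6-megabyte image size and the 24-line count are assumptions
chosen only to approximate the times quoted above, not figures supplied
by US Sprint.

    # Illustrative only: transmission time scales with aggregate line rate.
    # The image size and line counts below are assumed values for this sketch.
    def transfer_seconds(image_megabytes, bits_per_second, lines=1):
        """Seconds to move one image over `lines` parallel circuits."""
        bits = image_megabytes * 8 * 1_000_000
        return bits / (bits_per_second * lines)

    image_mb = 6  # assumed size of one diagnostic-quality image

    print(f"9600 bps analog line : {transfer_seconds(image_mb, 9_600) / 60:.0f} minutes")
    print(f"one 56 kbps line     : {transfer_seconds(image_mb, 56_000) / 60:.1f} minutes")
    print(f"24 x 56 kbps lines   : {transfer_seconds(image_mb, 56_000, 24):.0f} seconds")
    print(f"45 Mbps circuit      : {transfer_seconds(image_mb, 45_000_000):.1f} seconds")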
Tomorrow's technology will allow real time full motion imaging and
require bandwidths substantially greater than 45 megabits per
second, the highest speeds available today. A radiologist at a distant
location will be able to watch fetuses move and hearts beat, and
provide immediate diagnostic feedback. High speed networks are
required for real-time image transfers because video compression
greater than 2.5:1 is destructive to the image's clarity.
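A rough calculation suggests why real-time imaging outruns even a 45
megabit per second circuit. The frame size, bit depth, and frame rate
below are assumptions made for the sake of the sketch, not parameters
drawn from the testimony.

    # Illustrative only: bandwidth of an assumed real-time diagnostic image stream.
    frame_bits = 1024 * 1024 * 12      # assumed 1024 x 1024 frame at 12 bits per pixel
    frames_per_second = 30             # assumed full-motion frame rate
    raw_mbps = frame_bits * frames_per_second / 1_000_000
    print(f"Uncompressed stream    : {raw_mbps:.0f} Mbps")
    print(f"With 2.5:1 compression : {raw_mbps / 2.5:.0f} Mbps")  # still far above 45 Mbps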
Medical imaging is one of many high performance networking
applications. Computer Aided Design/Manufacturing (CAD/CAM) is
another. American industry will remain strong if it has the best
communication tools to do its work. Interactive CAD/CAM
will allow industry to work more quickly and efficiently, allowing
widely dispersed engineers to participate in the design process
without exchanging roomfuls of paper.
NREN TECHNOLOGY
The question posed by the legislation, however, is how
supercomputers can be made accessible to more users. And the
answer is the development of supernetworks with multi-gigabit
capacity - or NREN.
US Sprint is working with developments that would support the
NREN objectives. We are developing plans for a broadband test bed
akin to those established under the leadership of the National Science
Foundation (NSF), the Defense Advanced Research Projects Agency
(DARPA), and the Corporation for National Research Initiatives
(CNRI). US Sprint is a partner in a Midwest coalition that is
working with DARPA to develop a network concept plan for a
terrestrial, fly-over imaging application for the Department of the
Army's Future Battle Lab. The terrestrial, fly-over project would take
satellite pictures and convert them into computer-developed, "three
dimensional" landscapes that would allow the user to "fly over" or
"walk through" the terrain. Generals could "see" a battlefield without
sending out scouts!
Additionally, US Sprint has recently become an international vendor
for NSFNET providing links to research networks in France and
Sweden, and we now serve on NSF's Federal Networking Advisory
Committee to the Federal Networking Council.
Although many advances are being made towards the development
of the systems necessary for gigabit networks, many hurdles remain.
The fundamental building block required for gigabit networks exists
today. Fiber optic cables with ample bandwidth to support multi-
gigabit and higher transmission speeds criss-cross our country. US
Sprint's all fiber optic network operates today with a backbone speed
of 1.7 Gbps. We are currently testing 2.4 Gbps optical equipment in our
labs for installation on our high capacity routes next year. Our
transmission equipment vendors are developing the next generation
of optical systems with transmission speeds of 9.6 Gbps.
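As a rough illustration of what these line rates imply (treating the
quoted rates as usable payload and ignoring framing overhead, an
admitted simplification), each backbone generation could carry roughly
the following number of 45 Mbps research circuits:

    # Illustrative only: 45 Mbps circuits per backbone generation, overhead ignored.
    for gbps in (1.7, 2.4, 9.6):
        print(f"{gbps:>4} Gbps backbone ~ {gbps * 1000 / 45:.0f} circuits of 45 Mbps")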
Switching platforms also continue to advance with cell relay
technology. Many believe that cell relay switching best supports the
bandwidth-on-demand services essential to high speed networks.
Small, non-standard cell relay switches capable of switching traffic at
150 Mbps are on the market today. International standards for cell
relay are advancing rapidly, with many projected for completion by
1992. Nonetheless, difficult network design problems remain in cell
relay technology such as traffic congestion and routing. American
researchers are working toward solutions to these problems.
To achieve the NREN goals, compatible telecommunication and
computer standards must be written for the signaling, operation,
administration and management of high speed networks. These
network support systems are as important to the implementation of
the NREN as the transmission and switching systems. The
development of standards for these support systems requires careful
consideration and must parallel the evolution of gigabit technologies.
US SPRINT POSITION
Mr. Chairman, US Sprint fully supports the intent of the High
Performance Computing initiative. We are convinced that without
government seed money, supercomputer networking will be slow to
mature. Let me share two related thoughts with you, however, about
the legislation and its implementation, pertaining to network
applications and to the Committee's intent to phase the
NREN into commercial operation.
First, with respect to network applications, to speed the development
of high speed networks, US Sprint recommends broadening the scope
of the legislation to include a variety of high speed networking
applications. I have briefly described two applications, not requiring
supercomputers, that would serve pressing, existing needs.
Providing funds for applications research could stimulate many more
ideas within the research community. Each of these application ideas
could support a new group of users, further extending the benefits of
high speed networking to society. With applications as the driver,
high speed networks will grow in scale and ubiquity throughout the
country.
My second point, and one that I think is a concern to the Committee
as well, pertains to the phase-in to commercial operation, one of the
objectives to be realized by the network. Although the bill includes
language that the NREN be "phased into commercial operation as
commercial networks can meet the networking needs of American
researchers and educators," there is no path--given the current
development of the NSFNET--that gets us from here to there.
In fact, the government is creating a private--a dedicated--
telecommunications infrastructure that parallels the commercial,
public networks operating in the U.S. today. Rather than duplicate
commercial facilities with a government owned and operated
telecommunications system, we suggest that the NREN be established
through public network services--where the government's
networking requirements are combined with the public's
requirements in the development of commercial networks.
Otherwise, it is not clear how we will ever "phase" from a dedicated
U.S. government network to commercial networks.
With a public network service, industry would develop, own, and
operate the facilities to provide gigabit capability and offer that
capability as a service to the Government and other industry users.
In this environment, users are not obligated to full time, dedicated
service, but are oriented to a preferred, bandwidth-on-demand
scenario. A public, high speed network service would be positioned
much like today's public, long distance or virtual private networking
services. Users only pay when they use the service.
By evolving NREN as a public network service, the government also
takes advantage of existing network platforms. US Sprint for
example, offers a fully deployed, ubiquitous network service. We
fully integrate today's telecommunication requirements, combining
voice, data, and video services on a single network platform. US
Sprint integrates the management, operation, and administration of
that network into a single organization. NREN can only duplicate
public network features like these at tremendous cost. By leveraging
the existing infrastructure of public networks, the government can
realize the development of a more robust NREN, sooner, and at less
cost.
RECOMMENDATIONS
In short, Mr. Chairman, US Sprint recommends that the High
Performance Computing Act of 1991 address two issues.
First, the bill should authorize the funding of academic research for
applications requiring high speed network capacity in addition to
connecting supercomputers. As noted above, sophisticated medical
imaging requires higher speed networks. Similar applications that
require high speed networking should be funded under this
initiative. US Sprint believes that funding this type of research will
stimulate additional high speed network applications further
justifying the development of the network.
Second, the Committee should ensure that the design of the NREN
does not lead to a government owned and operated network. NREN
should be developed to share the gigabit capacity of existing public
networks and enjoy the advantages that public network operators
bring to their commercial customers. NREN could well operate as a
virtual private network on an existing public network, but it should
not operate as a separate network.
Mr. Chairman, US Sprint believes the NREN would develop more fully,
more economically, and more quickly if it were developed as a
shared, or public, network.
We appreciate the opportunity to address the Committee. I will be
happy to answer any questions that you may have.
Thank you, Mr. Chairman.
Summary Statement
Tracey Gray, Vice President of Marketing
Government Systems Division
US Sprint fully supports the intent of the High Performance
Computing initiative. We are convinced that without government
seed money, high performance computing and high performance
networking will be slow to mature.
US Sprint believes that the Committee should take two steps to help
realize its goal of establishing a multi-gigabit network by 1996.
First, the Committee, in its bill, should authorize the funding of
academic research that requires high performance networking
without requiring, necessarily, high performance computing. We
advocate this position because we are convinced that unless
additional applications for high speed networking are developed,
industry will not be able to justify the costs of developing multi-
gigabit networks devoted to linking supercomputers.
Second, US Sprint believes that the Committee should ensure that the
NREN, the National Research and Education Network, is not
established as a government owned and operated, dedicated
network. Rather, we believe that the NREN should be developed as a
public network service to take full advantage of the near and long
term technical features and administrative support systems
developed by public network providers. In our mind, the
industry/government partnership envisioned by the legislation will
only come to fruition if we marry our financial and technical
resources in the development of shared, public networks instead of
pursuing the development of exclusive, private networks. Moreover,
unless NREN develops as a shared resource, we cannot envision how
NREN will be phased into commercial operation as the legislation
anticipates.
US Sprint commends the Committee's foresight and initiatives with
respect to high performance computing and high performance
networking. We look forward to lending our expertise and resources
to help in meeting the Committee's legislative goals.
STATEMENT OF SENATOR ERNEST P. HOLLINGS
HEARING ON S. 272, THE HIGH-PERFORMANCE COMPUTING ACT
TUESDAY, MARCH 5, 1991
I am a cosponsor of S. 272, the High-Performance Computing Act,
because this is the kind of far-sighted legislation that should be a
priority here in the Senate. S. 272 addresses the long-term
economic, educational, and national security needs of this country.
We cannot just focus on the problems of today; we need to find
solutions to the problems of tomorrow as well.
The bill we are considering today will accelerate the
development of new technology and, just as importantly, speed up
the application of that new technology. By creating a National
Research and Education Network (NREN), this bill will link our
university labs to labs and factories in the private sector so they can
more effectively use the research done by university researchers.
Today the flow of information is truly global; the results of
research done at MIT now may be applied in a laboratory
somewhere else tomorrow. The NREN would help us take advantage
of that research. If our best research scientists are in constant,
instantaneous communication, through high-speed computer
networks, with the engineers and product designers in American
industry, we have a huge competitive edge.
The NREN and high-speed, commercial networks based on NREN
technology will not develop spontaneously. Federal leadership and
Federal investment are needed to spur the private sector to develop
these networks. S. 272 provides for this spur. It is an important
step toward exploiting the full potential of fiber optics in our national
telecommunications system.
The NREN and high-speed fiber optic networks are particularly
important to states like South Carolina. In South Carolina, we have
many colleges and universities which lack the resources available at
other research universities. The NREN will provide them with access
to facilities presently available only at places like Caltech and
Harvard. With the NREN, a researcher at the University of South
Carolina would have access to the very fastest supercomputers available
anywhere. A researcher at Clemson would be able to connect to a
radio telescope halfway across the country and collect data and
compare his or her results with colleagues around the country.
The applications of the NREN in education are even more exciting.
With access to the NREN and the "Digital Libraries" of electronic
information connected to it, students at the smallest colleges in South
Carolina, and at many high schools, would be able to call up more
information from their computer keyboards than they could find in
their school libraries. The NREN would broaden the horizons of
students at small colleges, two-year technical colleges, historically
black colleges -- at every college in South Carolina.
This is important legislation, and I look forward to working with
Senator Gore and others on the Commerce Committee on the bill.
TESTIMONY BY
DR. MALVIN H. KALOS
DIRECTOR, CORNELL THEORY CENTER
TO THE SENATE SUBCOMMITTEE ON SCIENCE, TECHNOLOGY,
AND SPACE
HEARINGS ON S. 272, THE HIGH-PERFORMANCE
COMPUTING ACT OF 1991
TUESDAY, MARCH 5, 1991
Mr. Chairman, it is a privilege to be invited to comment on the "High
Performance Computing Act of 1991" in the company of such a
distinguished group of representatives of government, industry, and
academia.
I am Malvin H. Kalos, Director of the Cornell Theory Center, and a
professor of physics at Cornell University. The Theory Center is an
interdisciplinary research unit of Cornell University, dedicated to the
advancement and exploitation of high performance computing and
networking for science, engineering, and industrial productivity. As
you know, the Theory Center is one of the National Supercomputer
Centers supported by the National Science Foundation. The Center
also receives support from the State of New York, and from industry.
My career spans 40 years of work with computers as a tool in
physics and engineering. I have worked in universities, industry, and
as a consultant to the Los Alamos, Livermore, and Oak Ridge national
laboratories in research devoted to the application of high
performance computing to further their missions.
We are witnessing a profound transformation of our scientific and
engineering cultures brought about by the advent and adoption of
high-performance computing and communications as part of our
technological society. The changes, some of which we see now, some
of which we easily surmise, and some of which we can only guess at,
have had and will continue to have wide-reaching benefits. Our
economic well-being and the quality of our lives will be
immeasurably improved. I salute the foresight and leadership of the
authors and cosponsors of this Bill, and the Administration. Senators
Gore and Hollings, Congressman Brown, and the President all
understand the deep and positive implications for our future. We are
also grateful for the support of Congressmen Boehlert and McHugh
whose backing of our efforts at Cornell and for the entire program
has been very strong.
The Director of the Office of Science and Technology Policy, Dr.
Bromley, has done essential work in translating the ideas into
effective policy. The Federal Coordinating Council for Science,
Engineering, and Technology (FCCSET) has, for the first time, brought
unity into the Federal approach to high-performance computing. This
is a well designed, well integrated program that shows good balance
between the need to exploit advancing supercomputing technology,
the need for very high performance networking, and the need to
bring these new tools to the widest possible community through
research and education.
I will begin with some historical and philosophical remarks about
science, using the history of physics, which I know best. Science is
not a dry collection of disconnected facts, however interesting. The
essence of science is the dynamic network of interconnections
between facts. For a scientist, making a connection never perceived
before can be the highlight of a career; the more distant the
connection, the more it is valued. Our aim is to connect all we know
in a seamless web of understanding. Historically, the greatest
contributions of the greatest scientists have been such connections:
Newton's between the fall of an apple and the motion of the Moon
and planets; Maxwell's between the phenomena of electricity,
magnetism, and the propagation of light; Einstein's leap of
understanding connecting quanta of light and the photoelectric effect.
These connections must be, to the greatest extent possible,
mathematical and quantitative, not merely verbal or qualitative.
Making these connections in a quantitative way remains at the heart
of pure science today, but it has become harder as we try to probe
into more and more complex phenomena, phenomena that cannot be
analyzed by the mathematical tools at our disposal. There are many
important examples in science that shed light on this paradigm.
Chemistry is one of our most important sciences, one that contributes
enormously to our grasp of the physical world and one whose
applications lie at the core of our understanding of materials we use,
wear, and eat, and of our health. The fundamental understanding of
chemistry lies in quantum mechanics and electricity, well understood
since the 1930s. Yet the translation of that scientific understanding
into quantitative knowledge about chemical materials and processes
-- polymers, chemical catalysis, drugs both harmful and healing -- is
very far from complete. Quite properly, chemistry is still largely an
experimental science. But the power of modern supercomputers is
transforming the face of chemistry at every level. We are coming to
understand how electrons cooperate to bind atoms into molecules,
molecules into larger structures, and to elucidate their structural,
dynamic, and biological effects. However, extraordinary numerical
precision, which can only be attained by very powerful
supercomputers, is required for this vital work.
Many other areas of science involve this kind of systematic
connection among different phenomena at different scales of length
or energy, including biology and medicine, the physics of materials,
and astrophysics.
The role of computation in linking disparate scientific fields is not a
contemporary development. The early evolution of modern
computers was dominated in the 1940s and 1950s by John von
Neumann, who was also a great mathematician. He designed
computers so that the very difficult questions that underlie such
scientific and engineering problems as fluid flow could be explored
and understood. Only later was it recognized that computers were
also important business tools. The essential role of computers in
science and engineering was well appreciated by many groups in
the United States, including the national laboratories, and their use
contributed very much to the development of nuclear weapons,
fusion technology, and the design of aircraft.
The use of computers in academic science and engineering evolved
more slowly, partly because of the failure of many to see the
possibilities, partly because the policies of the Federal government at
the time discouraged scientists from participating fully. My own
career was impacted negatively by these policies. It was the
leadership of a few scientists, notably Dr. Kenneth Wilson, that
created the modern climate of respect for the accomplishments and
possibilities of computational science in the future of our country.
The constructive contributions of the Congress and the National
Science Foundation in creating the National Supercomputer Centers
are noteworthy. That creation was, in a profound sense, the mark of
the entry by the mainstream of American research into the era of
computational science at the heart of science
and engineering.
It is also important to note that computational science is now an
essential tool in experimental science as it is currently practiced. The
most advanced scientific instruments, optical and radio telescopes,
particle accelerators, and computers themselves are studied,
designed, optimized, and verified with computer simulation. Data
collection is usually automated with the help of computers, and the
reduction to comprehensible data sets and pictures may involve
enormous computations. Exchange of large data sets and the
cooperative work in understanding them will require very large
computations and very heavy use of future high capacity data
networks. Finally, in many cases, even reduced data are
incomprehensible except when studied in the light of complex
theories that can be understood only by simulation.
Now the entire scientific and engineering community of the country
has the opportunity to exploit these new tools. Many researchers are.
Important new scientific discoveries are being made. New ideas and
connections are seen everywhere. More important, students and
young scientists, who are always the very heart of any important
scientific change, are involved. They are coming to understand the
techniques, the promise, and the limitations of computational science.
Their knowledge and its applications are the most important
products of our efforts, and they will carry the message to the rest of
our society and to the future. It is they who will have the most direct
impact upon industry in the United States.
The science made possible throughout the nation by the resources of
the Theory Center spans all scales of length and energy, from the
galactic through the planetary and the earth's crust, to the behavior
of man-made structures, materials at the microscopic level, and the
physics of elementary particles. From another perspective, it spans
the traditional disciplines of physics, chemistry, mathematics,
biology, medicine, all fields of engineering, and agriculture and
veterinary medicine.
Although I describe research at or made possible by the Theory
Center, the other National Centers, at San Diego, Champaign-Urbana,
and at Pittsburgh, can easily list an equally impressive set of
accomplishments in pure and multidisciplinary science.
It is perhaps unfair to cite a few at the expense of so many others,
but the work of Stuart Shapiro and Saul Teukolsky on fluids and
fields in general relativity is outstanding and has been recognized by
a significant prize, the Forefronts of Large-Scale Computation Award.
Their research comprises both the development of mathematical and
numerical methods for the exploration of astrophysical and
cosmological phenomena and the use of these methods to develop
quantitative understanding of the formation of black holes and the
characteristics of gravitational radiation.
John Dawson of UCLA uses the Theory Center resources to study the
unexpected results of the Active Magnetospheric Particle Tracer
Explorers experiments. In these, barium and lithium were injected into the
earth's magnetosphere, creating, in effect, an artificial comet. The
observations contradicted existing theories and simulations. Dawson
and Ross Bollens constructed a hybrid theory and simulation that
models the observed effect.
Henry Krakauer of the College of William and Mary uses a modern
"density functional" theory of electronic structure to examine the
nature of the electron-phonon interaction, known to be responsible
for low-temperature superconductivity. The aim is to determine its
role in high-temperature superconductivity. Work like this is being
carried out throughout the world and will require the fastest parallel
supercomputers of the future. Having them available to American
researchers, including those who are not at major research
universities, gives them and American industry a competitive edge.
The research of Harold Scheraga and his group at Cornell into the
three-dimensional structure of proteins shows an equally broad
range of activity: the investigation of the fundamental interactions of
the amino acid units with each other and with solvent atoms, the
basic computational techniques needed to find the optimal structure,
and the biochemistry of proteins. This is research that is particularly
well suited to highly parallel computing, and will require, in the long
run, the full use of future teraflops machines.
Understanding the properties of the earth's crust is the subject of the
research of Larry Brown and the Consortium for Continental
Reflection Profiling (COCORP). This national group uses the
supercomputers to reduce, display, and interpret the huge data set
that is gathered by seismic probing (to 30 km or more) of the
continental crust.
I cited earlier the fundamental importance of scientific computing in
enabling the connections among different phenomena within
scientific disciplines. Even more important is its role in permitting
quantitative connections among different disciplines, that is, in
supporting multidisciplinary research. Every one of the large
problems that confront our society, and to whose solutions we expect
science to contribute, is in some sense a multidisciplinary problem.
For example, issues of the environment involve many sciences --
chemistry, physics, engineering, fluid flow, biology, and materials.
Medicine is equally demanding in its call upon diverse science. As
we have indicated, biochemistry and its relations to chemistry and
physics play a central role in medicine. But other areas are
important as well. As part of my oral presentation, I will show a
video of a supercomputing study of the uses of ultrasound in the
treatment of eye tumors. The building of modern prosthetic devices
uses many resources of computation, from the reduction of CAT scans
to the computational optimization of the mechanical properties of the
devices. Understanding blood flow in the heart requires a mastery of
fluid dynamics of viscous media plus the knowledge of the elastic
properties of the heart and its valves.
Bringing the knowledge from these fields together to make
quantitative predictions about the effects of some technological or
regulatory proposal is a difficult undertaking, one that is utterly
impossible without the use of computational modeling on high-
performance computers. Computational modeling is the indispensable
natural language of quantitative multidisciplinary research.
An outstanding example of such work is that by Greg McRae of
Carnegie Mellon University. He uses supercomputers and
supercomputer-based visualization to explain from basic chemistry,
fluid mechanics, meteorology, and engineering the scientific effects
that underlie the development of air pollution in the Los Angeles
Basin, and the probable effects of fuel changes and regulatory
procedures. His results have been used to influence regulatory
policy constructively.
The Global Basins Research Network (GBRN), a consortium directed
by Larry Cathles of the Geology Department of Cornell University and
by Roger Anderson of Columbia University's Lamont-Doherty
Geological Observatory and which includes eight academic and 11 industrial
partners, has as its goal the multidisciplinary understanding of the
chemical, physical, and mechanical processes that occur in a
sedimentary basin such as the one in the Gulf of Mexico below
Louisiana. They have assembled a composite database of the
observations of the basin and are using computational modeling to
explain the data. But simply collecting and displaying the data in a
coherent visual way has led to new and deeper understanding of the geology.
The outcome of this understanding is very likely to improve oil
recovery world-wide. I will also show a video clip of a visualization
of the data set that was prepared jointly by the Theory Center and
the GBRN.
It is important to note that this research involves a wide range of
partners, geographically dispersed, and that the medium of
information exchange is usually visual. High-performance
networking is essential to the GBRN and to similar scientific
enterprises.
Another important development is the establishment at Cornell of
the Xerox Design Research Institute, with the participation of the
Theory Center, the Computer Science Department, and the School of
Engineering. Directed by Gregory Zack of Xerox, and involving
researchers from Xerox centers nationwide, the aim of the Institute,
quite simply, is to improve Xerox's ability to bring better products
more quickly to market. The techniques are those of computational
and computer science. A vital aspect of the research is the
development of methods whereby the geographically separate
centers can effectively collaborate. Again, high-performance
networking is key.
As our reach extends, the necessary partners required to carry out
important collaborative research will rarely be found at one
institution or even in one part of the country. Essential experimental
devices or data bases may exist anywhere. Rapid, concurrent access
is essential, and it will place ever higher demands on bandwidth. The
NREN is necessary for the full growth and exploitation of the scientific,
technological, and educational implications of computational science.
The GBRN and Xerox examples suggest that the greatest potential lies
in industrial use.
The supercomputing community will soon find itself at a major
crossroads -- where the increases in performance needed for the
fulfillment of our scientific mandate will demand parallel
architectures. To exploit these new machines, a major retooling of
software and algorithms will have to take place. This is not a trivial
undertaking, yet it must be started very soon if we are to make
progress on the Grand Challenge problems in the mid-1990s.
The High-Performance Computing and Communications program will
offer us an essential opportunity to bridge the gap between today's
high performance vector machines and tomorrow's highly parallel
systems.
I have emphasized how science and its application to societal
problems are communal activities, activities that involve, more or
less directly, the entire scientific community. Bringing to bear the
transformation made possible by computational science in the most
complete and positive way requires that its techniques and strategies
be learned, used, and shared by the widest possible group of
researchers and educators. That means advancing the art, acquiring
the best and most powerful tools of hardware, software, and
algorithms, and coupling the community in the tightest possible
ways.
The "High-Performance Computing Act of 1991" is a vital step in that
direction.
Statement by DONALD N. LANGENBERG
Chancellor, The University of Maryland System
Before the Senate Subcommittee on Science, Technology, and Space
Committee on Commerce, Science, and Transportation
United States Senate
March 5, 1991
Donald N. Langenberg is Chancellor of the University of Maryland
System. With a doctorate in physics, Dr. Langenberg has held faculty
and administrative positions at the University of Pennsylvania and
the University of Illinois at Chicago. He served as Acting and Deputy
Director of the National Science Foundation. He is currently Chairman
of the Board of the American Association for the Advancement of
Science and Chairman of the Executive Committee of the National
Association of State Universities and Land-Grant Colleges. He chaired
the panel of the NAS/NAE/IOM Committee on Science, Engineering,
and Public Policy that authored the 1989 report, Information
Technology and the Conduct of Research: The User's View.
Mr. Chairman and Members of the Subcommittee:
Thank you for your invitation to testify on S. 272, the High-
Performance Computing Act of 1991.
I am Donald Langenberg, Chancellor of the University of
Maryland System. My view of the issues addressed by this bill has
naturally been shaped by my own experience. I am, or was, an
experimental solid state physicist. I have served as Deputy Director
and as Acting Director of the National Science Foundation. I am
currently CEO of an eleven-campus state university system,
Chairman of the Board of the American Association for the
Advancement of Science, and Chairman of the National Association of
State Universities and Land-Grant Colleges. These affiliations account
for some of my biases, but most are a result of my service as chair of
a National Research Council panel that wrote a 1989 report entitled
Information Technology and the Conduct of Research: The User's
View.
My service on the panel convinced me that the current
breathtaking rate of change in information technology will inevitably
force historic changes in our institutions for managing information.
Nowhere is this more evident than in the research and education
communities that both create important new developments in
information technology, and are often bellwethers in its use. It is the
viewpoint of these communities that I will try to represent this
afternoon.
Information is the fundamental stuff of both research and
education. Research and education are about the creation of
information and its transformation into knowledge and
understanding, for our individual and collective benefit.
Modern information technology has presented us with a
challenge of unprecedented scale. The Library of Congress contains
about 10 terabytes of information. It took us over two centuries to
collect it. It is stored nearby in an impressive collection of expensive
real estate. Medical imaging machines nowadays produce that much
information every week or so. The particle detectors of the
Superconducting Super Collider will one day engulf their designers
with that much information every few seconds. NASA already has
1.2 million magnetic tapes containing data from past missions, and its
archives are growing by about one Library of Congress every year.
In ten years, if all goes according to plan, NASA will be piling up
about fifty Libraries of Congress each year. Everywhere one looks,
similar gushers of information exist or are in prospect.
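To put these volumes in rough perspective, the following
back-of-the-envelope sketch uses an assumed capacity of 200 megabytes
for an early-1990s tape cartridge; that figure is an assumption for
illustration, not one drawn from this statement.

    # Illustrative only: converting the volumes described above into tapes.
    LIBRARY_OF_CONGRESS_TB = 10      # as stated above
    TAPE_CAPACITY_MB = 200           # assumed capacity of one cartridge

    tapes_per_loc = LIBRARY_OF_CONGRESS_TB * 1_000_000 / TAPE_CAPACITY_MB
    print(f"One Library of Congress        ~ {tapes_per_loc:,.0f} tapes")
    print(f"Fifty Libraries of Congress/yr ~ {50 * LIBRARY_OF_CONGRESS_TB} TB per year")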
Fortunately, modern information technology also promises to
give us the means to meet this challenge. Transforming promise into
reality, however, will take time, skill, resources, and, above all,
wisdom. In my opinion, S. 272 represents a major contribution to
that transformation. I strongly support its passage into law.
Let me make a few points related to the work of our NRC panel.
1. The Panel found that there exist significant technical,
financial, behavioral, and infrastructural impediments to the
widespread use of information technology in research. Though the
Panel's charge was confined to research, I believe the same
impediments exist with respect to education. The Panel made three
main recommendations and a host of subrecommendations for
dealing with these impediments. S. 272 responds to most of them.
2. One of the Panel's three principal recommendations was that,
"the institutions supporting the nation's researchers, led by the
federal government, should develop an interconnected national
information technology network for use by all qualified researchers."
S. 272's National Research and Education Network (NREN) responds
directly to the need reflected in this recommendation, and also to the
very important collateral need of the educational sector. In my
judgment, NREN will revolutionize both research and education (in an
evolutionary way, of course).
3. When one thinks of what NREN might do for education, one
thinks first of the education of scientists and engineers, then perhaps
of the incredible potential inherent in linking NREN to every
elementary school, secondary school, public library, and museum in
the country. There is another educational need of utmost importance.
I believe that part of the challenge we face is the creation of an
entirely new kind of institutional infrastructure for managing the
new information technology, led and supported by a new breed of
information professionals. The latter may bear some resemblance to
librarians, or to computer scientists, or to publishers. Whatever they
might be, we need to create schools for training them and institutions
within which they can function. That means educational and
institutional innovation of a kind S. 272 appears well designed to
foster.
4. The most important words in the title of our panel report reflect
our most important observation. They are "the user's view." In
simple terms, the Panel concluded that the development of
information technology and its applications in the conduct of
research (and, I would add here, education) are far too important to
be left to the experts. The Panel cautioned that planning and
development should be guided by users of information technology,
both current and prospective, not by information specialists,
information scientists, information technologists, or local, national,
and international policymakers. It may not invariably be true that
"the customer is always right," but institutions that create technology
or make policy without a clear understanding and appreciation of the
real needs of their clients and constituents risk making serious and
expensive blunders. S. 272 calls for the advice of users in the
development of the National Research and Education Network. I
especially applaud this provision.
5. In my preface to our panel's report, I wrote:
"I share with many researchers a strong belief that much of the
power of science (whether practiced by scientists, engineers, or
clinical researchers) derives from the steadfast commitment to free
and unfettered communication of information and knowledge. This
principle has been part of the ethos of the global research
community for centuries, and has served it and the rest of humanity
well. If asked to distill one key insight from my service on this panel,
I would respond with the assertion that information technology is of
truly enormous importance to the research community, and hence to
all humanity, precisely because it has the potential to enhance
communication of information and knowledge within that community
by orders of magnitude. We can now only dimly perceive what the
consequences of that fact may be. That there is a revolution
occurring in the creation and dissemination of information,
knowledge, and ultimately, understanding is clear to me. It is also
clear to me that it is critically important to maintain our commitment
to free and unfettered communication as we explore the uses of
information technology in the conduct of research."
What I asserted there about research, I would assert now about
education. If I am right, then by far the most profoundly important
consequence of the creation of NREN will not be the expedition of
research or the improvement of next year's balance of trade. It will
be the fundamental democratization of all the world's knowledge.
That means placing the accumulated intellectual wealth of centuries
at the beck and call of every man, woman, and child. What that
might mean can only be guessed, but let me reminisce for a moment.
I grew up in a small town on the Great Plains. In that town was a
Carnegie Library, one of hundreds Andrew Carnegie endowed across
the nation. That modest building and the equally modest collection it
housed opened the world to me. I have been grateful to the
Pittsburgh steelmaker ever since. What if I had had direct personal
access to the Library of Congress, the British Museum, the Louvre,
and the Deutsches Museum, all in the course of a summer afternoon
in North Dakota? Imagine!
My point here is that there is an overriding public interest in NREN
and in the rest of the provisions of S. 272, an interest that transcends
research and its industrial applications, or issues of governance and
the timetable for commercialization. We have an opportunity here
for an American achievement of truly Jeffersonian proportions. Let's
not blow it!
6. Finally, I note with approval that S. 272 identifies the National
Science Foundation as the lead agency for the development of NREN.
The choice is wise, I think. NSF has a demonstrated capacity to
manage large, complex technical operations. Unlike other S&T
agencies, NSF focuses not on some "mission," but on its "users," i.e.,
its client science and engineering communities. And, perhaps most
important, alone among federal agencies NSF bears responsibility for
the support of research across the full spectrum of scientific and
engineering disciplines, and for the training of those who perform
the research, and for the general education in science and technology
of everybody else.
You will have gathered that I have considerable enthusiasm for S.
272. I do! I urge you and your colleagues to enact it into law.
Testimony of
David C. Nagel, Ph.D.
Vice President, Advanced Technology
Apple Computer, Inc.
Government Affairs Office
1550 M Street, N.W., Suite 1000
Washington, D.C. 20005
(202) 872-6260
On Behalf of the Computer Systems Policy Project
(CSPP)
Before the Science, Technology and Space Subcommittee
of the
Senate Commerce, Science and Transportation Committee
S.272
THE HIGH PERFORMANCE COMPUTING ACT OF 1991
March 5, 1991
Introduction
Apple Computer, Inc. and the other members of the Computer
Systems Policy Project (CSPP) are very appreciative of the
opportunity to appear before this Subcommittee on the issue of high
performance computing. As several of us have said in previous
appearances before this subcommittee, the health of the U.S.
computer industry is inextricably tied to the future health of the
nation as a global economic power. Although the U.S. has been for
decades preeminent in both the development of the most advanced
computer technology in the world and in the capture of the largest
share of the global computing systems market, that leadership is
being steadily eroded by our global competitors.
In purely economic terms, the U.S. computer systems industry
plays a vital role in the U.S. economy. In 1989, for example, our
industry exported more than $22B in computer equipment alone, or
more than 6% of total U.S. exports that year. Our industry employs
almost 600,000 workers in the U.S. When we look beyond the
immediate economic picture and into the future, few would argue
with the belief that the health of the computer systems industry will
serve as a bellwether to the overall health and leadership of the U.S.
as a global economic and industrial power. It is difficult to think of
significant technical advances over the past two decades in any
segment of the economy that have not relied on computer systems.
The computer systems industry is clearly a building block for other
industries. Computer systems products are necessary and critical
components of virtually all modern manufacturing and service
industries, and the development and operation of most of the
sophisticated weapons systems in the U.S. arsenal would be
impossible without computer systems and electronic components.
In the fall of 1989, the eleven largest computer systems
companies in the U.S. formed the Computer Systems Policy Project to
address technology and trade policy from the computer systems
industry perspective. As a reflection of the seriousness with which
the industry views the future of computer technology in the U.S., the
CSPP is an association of the Chief Executives of Apple, Hewlett-
Packard, Compaq, Cray, IBM, Control Data, Digital Equipment, NCR,
Sun Microsystems, Tandem and Unisys. One of the major goals in
forming the CSPP was to provide the industry and policy makers in
Washington, D.C. the data and perspective necessary to the
development of effective, long-range policies both in the
development of technology and in the improvement of our trade
position globally. Each of the member companies - including the
CEO's, Chief Technologists, and supporting staff - has made a
significant commitment to this project over the past year and a half.
CSPP began its study more than a year ago with an internal look
at the health of our industry including: an assessment of the
technologies that are critical to computer systems; an assessment of
how the United States is doing with these technologies compared to
our foreign competitors; and a prognosis for U.S. industry
performance into the future. In summary, the results of this initial
analysis were mixed. While the U.S. computer systems industry still
today is the strongest in the world (both in terms of technology
leadership and overall market share), our lead is diminishing rapidly
by almost all the measures we examined. In addition, leading
indicators of future health provide little cause for optimism.
In 1983, U.S. companies held an 83% share of the world market for
computer systems (including software). Between 1983 and 1989, our
share of the worldwide market declined from 83% to 61%. During this
same period, Japan's share rose from 8% to 22% and Europe's share
grew from 10% to 15%. Figure 1 shows a similar decline in our share
of the world market for computer hardware. Here the U.S. went from
supplying well over half of the world's computer
equipment to supplying less than our primary competitors, the
Europeans and the Japanese. More troubling, the computer systems
industry went from a significantly positive contribution to the U.S.
trade balance throughout the 1980s to a position in 1990 where
our imports almost exactly balance our exports (Figure 2). We note
that while the U.S. ratio of exports to imports moved steadily
downward over the past decade, Japan even more dramatically has
increased its export/import ratio from around 2 in 1980 to more
than 6 at the end of the 1980's. Finally, in the category of leading
indicators, the U.S. is failing significantly in the competition for
computer systems patents. Whereas in 1978, the U.S. received over
60% of all computer systems patents, by 1988 we were being
granted new U.S. patents only at the rate of 40% of the total. In the
aggregate, Japanese industry was awarded nearly as many patents in
the U.S. as were domestic manufacturers. Figure 3 illustrates these
trends.
While these findings are clearly troubling, the members of CSPP
recognize that the primary burden for staying competitive in the
global marketplace rests squarely with U.S. industry. Thus, to begin
our internal assessment, we examined our own investment levels
and competitive positions in the key technologies critical to success
in our highly competitive and highly technical business. We
identified, for example, 16 critical pre-competitive generic
technologies, and after significant analysis by the chief technologists
of the CSPP, concluded that the U.S. still leads the world in half of
these (data-base systems; processor architecture; human interface;
visualization; operating systems; software engineering; application
technology). Seven of the eight technologies for which the U.S. has a
lead worldwide are software intensive. We concluded also that the
U.S. lags the world in several critical technologies (displays; hard
copy technology; manufacturing technology; semiconductor
fabrication; electronic packaging). For the remainder (networks and
communication; storage; microelectronics; fiber optics), a once solid
lead is diminishing. In contrast to the technologies for which the U.S.
holds a lead, the lagging technologies are mostly capital-intensive.
The chief technologists of the CSPP also concluded that the
prognosis for leadership in these technologies over the next five
years is that, without positive action, the U.S. position will erode
further in all 16 technology areas. It is with this perspective that the
CSPP began taking a closer look at what might be done to mitigate
these negative trends.
The CSPP supplemented its technology assessment with a review
of the role of government investment in R&D in the U.S. and other
countries (Figures 4 through 9). We came to some fundamental
conclusions. First, the overall level of R&D spending in the U.S. at
$135B in 1989 is substantial by any measure, greater than that of Japan
and the European Community by significant margins (Fig. 5). The overall
investment is split almost evenly between industry ($70B) and
government ($65.8B). The computer systems industry spends 21% of
private sector R&D, or about 10% of the total national investment in
R&D (Fig. 6a). The investment of the computer industry in 1989 -
more than $18B - is more than that of any other industrial sector and
represents a 26% increase over the amount we spent in 1988, during
a period when other industrial sectors were reducing their R&D
spending. In contrast to the level of investment of private industry,
the U.S. government only invested about 2% of its R&D portfolio in
generic technologies related directly to the computer industry (Fig.
6b). If we look at the electronics industry as a whole, about 30% of
private R&D was spent by the electronics industry while the
government invested only 6% of its R&D budget in electronics
research. In general, the ratio of private to government R&D
spending seems out of proportion relative to other industrial sectors
(e.g. aerospace, health care, etc.).
While we found that government spending on R&D has increased
significantly in absolute levels over the past 25 years, defense-
related spending has consumed a greater and greater share,
increasing from a historical share of 50% to a high of 70% in 1987. It
has remained at about the level of two-thirds of all government R&D
spending since that time (Fig. 7). By contrast, the Japanese
government allocates only 4% of its R&D budget to defense research
(Fig. 8). Selected European countries spend an average of 30% of their
government research budgets on defense. Among our principal
competitors, only the government of France spends a greater
percentage of its GNP on total R&D than does the U.S. government
(Fig. 9).
In our initial "Critical Technologies Report", the CSPP identified
R&D as one of the most significant factors in determining the success
of the industry's performance in 15 of 16 critical technologies. It is
therefore not surprising that the computer systems industry
performs 21% of private sector R&D and 10% of the total national
R&D effort. We recognize that this investment is our lifeblood.
Computer industry spending on R&D has increased at a much faster
rate than government spending over the last two decades, a practice
that has been required to keep pace with rapidly changing
commercial demands and increasing levels of international
competition.
How should the government and industry R&D investments be
split to maximize the benefits to U.S. industry and the U.S. economy?
First, investment in generic, pre-competitive technologies such as
electronics, materials and information technologies is important
because these are the building blocks for advancements in the
computer industry. Our assessment of the existing Federal research
effort reveals that the federal R&D investment is contributing
disproportionately little to these generic, pre-competitive technology
developments. The federal R&D budget is not focused in ways needed
to enhance and preserve our economic competitiveness given the
rapid pace of innovation and the R&D practices by other countries.
We acknowledge that the degrees of success of the various
European (ESPRIT, BRITE, EURAM) and Japanese (5th Generation
Computer Project, Super-Sigma Project, an advanced
telecommunications research institute, etc.) research projects are not
necessarily directly related to the absolute amount of government
spending. Rather, we believe that the relative success of the Japanese
projects (as reflected in the competitive position of Japanese
industry) illustrates the benefits of close cooperation between the
private and public sectors and of well-managed, focused efforts for
advanced technology projects. Moreover, while in the past, defense
R&D was a major source of technological advancement in the U.S. and
the computer industry in particular benefited from defense research
dollars, we believe that today, because of heightened demand for
improved commercial products and the accelerating pace of global
competition, the private sector is now the primary catalyst for
innovation.
We have concluded from these analyses that while the total
amount of federal R&D spending is probably adequate, it needs to be
managed more effectively if the U.S. computer industry is to be
able to compete in the technology areas essential to our future
economic health. In short, we believe that federal R&D is not as
helpful to the computer industry as it might be.
Based on the data and on the strength of our analyses, CSPP has
outlined an initial set of technology policy recommendations. We
believe that these recommendations provide a strategy for better
focusing the federal R&D investment in pre-competitive, generic
technologies and for helping the U.S. meet international
competitive challenges by increasing industry involvement in federal
R&D priority setting. We believe that by working together, industry
and government can improve the nation's return on the total R&D
investment and can help to meet the international challenges to this
country's technological strength.
Recommendations for Improvement
We believe that the return on public and private investments in
R&D can be improved by coordinating research priority setting and
by allocating federal research dollars to more closely reflect the
private sector's role in developing the general technologies that are
key to the nation's economic growth. Increased investment in
microelectronics, information technologies, and materials will provide
a solid foundation for advancements not only in computer systems
but also in aerospace, medical, energy, environmental and virtually
every other area of research important to the future of our society.
The CSPP believes that government and industry jointly must
take the following first steps to improve the effectiveness of R&D
spending in the U.S.:
- Improve the mechanisms within OMB for reviewing federal
R&D spending;
- Increase industry input in setting federal R&D priorities to
better manage the federal R&D budget;
- Work with industry to set federal laboratory priorities to
improve the return on the national R&D investment; and
- Implement the High Performance Computing Initiative,
including a national network capable of bringing the benefits of
computing to every institution, household, and school in the nation.
CSPP has established three CEO-level working groups to develop
specific plans that will improve the economic return on the national
R&D investment by:
- Improving industry participation in federal R&D priority setting
and the federal R&D budget review process;
- Increasing the degree and effectiveness of interaction between
industry and the federal laboratories; and
- Implementing the High Performance Computing and
Communications Initiative.
CSPP CEOs, chief technologists, and staff are actively working on
development of plans that address these three issues. Once
completed, we intend to make the results of these investigations
available to policy makers, including members of this Subcommittee.
Improving the R&D Budget Review Process
CSPP believes that the Administration and Congress must develop
a better sense of how their $76B investment in R&D is being spent. To
make the distribution of funds more understandable, we urge the
Congress and the Administration to develop a comprehensive
summary of the federal R&D budget - budget crosscuts - including
summaries of agency initiatives related to development of generic
technologies. We are pleased that OMB is providing budget
summaries in several key areas, including high performance
computing, the subject of this bill, and is considering the
development of similar information for other important research
areas such as materials.
We believe that industry perspectives can improve the
effectiveness and usefulness of these budget summaries. Once such
summaries are available, strategies can be more
easily developed with industry participation to bolster investments
in needed areas or to shift priorities where necessary. This should be
done on an ongoing basis. We understand that industry participation
in such activities may be problematic because of ethical, regulatory,
and legal impediments, and we have established a CEO-level working
group to identify these impediments and to develop
recommendations for advisory mechanisms that are consistent with
legal and other requirements and that provide the greatest
opportunity for industry participation.
Increasing Interactions Between Industry and the National Labs
The Federal government spends billions each year on R&D in
federal labs, three-fifths of which goes to defense programs. CSPP
believes that much of that R&D, properly focused, could be
substantially more useful to the computer industry than it is today.
We believe that the nation's return on the federal lab investment can
be enhanced by increasing private sector input into lab activities and
by shifting some labs' research priorities to include generic
technologies that have commercial potential. CSPP has established a
CEO-level working group to recommend ways to improve the federal
laboratories' contributions to the national R&D effort, including
developing funding mechanisms for joint industry-lab projects of
interest to the private sector; identifying potential and current
laboratory research projects and areas that could benefit the
computer industry; and identifying research areas that lend
themselves to budget crosscut analysis. The results of this analysis
and recommendations will be issued later this year.
Implementing the High Performance Computing and Communications
Initiative
Finally, CSPP fully supports and recommends fully funding a
national high performance computing and communications R&D
program, including implementing, in conjunction with academia and
the private sector, a national research and education network. Thus
the CSPP strongly supports the goals of S. 272 as well as the
Administration's High Performance Computing and Communications
(HPCC) Initiative. We believe that these efforts are critical to provide
the research infrastructure required to maintain our nation's
leadership in basic research and to expand our capability to perform
the applied research which leads to commercialization of technology.
The CSPP believes that the HPCC will be instrumental in the
achievement of national education and work force training goals, an
achievement that will be increasingly important to the economic and social health
of our nation.
CSPP will support this effort through a long-term project to
identify possible future applications of a network that will enhance
the quality of life and economic competitiveness of the nation. We
believe that computer and networking technology can help to solve
problems and to realize opportunities in U.S. homes, factories,
universities, workplaces, and classrooms. We have established a CEO
working group to identify innovative network applications, the
technological advances needed to accomplish them, and the best
ways to describe the benefits of these applications to the public.
We are working, as well, to acquaint ourselves with the HPCC
budget crosscut and with specific agency plans for research and
development. Once we complete this survey, we will examine the
relevance to the computer industry of the research being conducted
as part of the initiative. Later this year, CSPP will provide
recommendations to improve federal spending under the initiative.
Although we have not yet completed our analyses, CSPP believes
that creation of the NREN is an important first step toward realization
of what some have termed a national information infrastructure. This
national infrastructure would in effect constitute a very high
performance electronic highway that will address the needs of
business, schools, and individual citizens as well as institutions of
research and higher education. With 80 percent of the U.S. economy
classified broadly as services-related, the potential user base of such
a national infrastructure is immense. We believe that the existence of
such an infrastructure would allow the U.S. service economy,
including the education component, to operate significantly more
efficiently than today. We imagine that users of the national
information network will have access to immense digital libraries
and databases and that this access will transform both education and
commerce. We believe too that health care will be transformed by
the existence of a national digital information network. Vast
databases encompassing the basic biological sciences (molecular
biology, biochemistry, genetics) and applied medical information
such as diagnostic and treatment data will be needed eventually to
improve both the quality and efficiency of the U.S. health care
delivery system.
We recognize and applaud the pioneering role that this
subcommittee and its Chairman, Senator Gore, have long played in
recognizing the importance of the development of a national
information infrastructure, a research and education network, and an
effective high performance computing program. The achievement of
a true national information infrastructure is an undertaking of very
significant complexity. The interim step of developing an NREN
will allow solutions to be developed to important technical,
policy, economic, regulatory, and social problems, solutions that will
point the way toward a true national information infrastructure.
Specific Comments About S. 272
In Section 5 of the bill, we especially applaud the provision for a
National High Performance Computing Plan and the establishment of
a High-Performance Computing Advisory Panel consisting of
prominent representatives from industry and academia. These
provisions are in keeping with both the spirit and substance of CSPP
findings to date and the CSPP stands ready to participate in such an
Advisory Panel as needed. We applaud as well the Section 5
provision requiring the Panel to provide the FCCSET with an
independent assessment of whether the research and development
funded under the High Performance Computing Plan is helping to
maintain United States leadership in computing technology.
In Section 6 of the bill, FCCSET is charged with development of the
"goals, strategy, and priorities" for an NREN. While we support this
provision as an important first step, we believe that some attention
should be given as the program progresses to issues which surround
development of a true national information infrastructure. For
example, agencies could be directed to perform analyses that would
identify impediments, regulatory or otherwise, toward achievement
of a true national information infrastructure and conduct other
studies or research that will lead to solutions to these impediments
as experience is gained in the development and operation of NREN.
Again, CSPP would welcome the opportunity to contribute to such
analyses and otherwise support the achievement of the goals of the
High Performance Computing Act of 1991.
Conclusions
CSPP recognizes that improving U.S. technology policy is a long-
term process that cannot be addressed by any one organization, any
single set of recommendations, or any given piece of legislation.
Improvement of U.S. technology is, nonetheless, an essential process
that will require cooperative R&D investments and the partnership of
the private sector and the government. Improving U.S. technology
requires a long-term commitment and a series of changes by
industry and government over time. Whether as independent CEOs
or as an industry, the members of the CSPP are committed to and
will remain involved in this process. CSPP believes that the high
performance computing and communication program will constitute
an important cornerstone by improving the harvest of federal R&D
investments in computing and other pre-competitive technologies
and by enhancing the competitiveness of the U.S. in the increasingly
competitive global economy.
Supercomputing Network: A Key to U.S. Competitiveness
in Industries Based on Life-Sciences Excellence
John S. Wold, Ph.D.
Executive Director
Lilly Research Laboratories
Eli Lilly and Company
Testimony
U.S. Senate, Commerce, Science and Transportation Committee
Science, Technology and Space Subcommittee
March 5, 1991
I am John S. Wold, an executive director of Lilly Research
Laboratories, the research-and-development division of Eli Lilly and
Company. Lilly is a global corporation, based in Indianapolis, Indiana,
that applies advances in the life sciences, electronics, and materials
sciences to basic human needs -- health care and nutrition. We
compete in the pharmaceutical, medical-devices, diagnostic-products,
and animal health-products industries.
My responsibilities at Lilly include the company's supercomputing
program. With me is my colleague, Dr. Riaz Abdulla -- whom you just
saw on videotape. Riaz manages this program on a day-to-day basis.
I'm indeed pleased to have this opportunity to present my
company's views about the importance of a national commitment to
supercomputing and to a supercomputing network.
I'm sure that this subcommittee has heard -- and will hear much
more -- about the underlying technology required to support the
evolution of supercomputers and supercomputing networks. It's
important, I believe, that you share computing technologists'
excitement about their visions of supercomputing systems,
algorithms, and networks. But I believe it is just as important for you
to share the visions that motivate research-oriented institutions, like
Lilly, to invest in supercomputers and to encourage their scientists
and engineers to use these systems. It's important for you to hear
supercomputer users support S. 272.
Today, I'll try to articulate two levels of aspirations we at Lilly have
for our supercomputing program:
- First, we believe that Lilly scientists will use these powerful
new research tools to address fundamental research questions.
Answers to these questions will help us develop more-selective,
more-specific drugs with greater efficacy and fewer side effects.
These new medicines will represent important new products for our
company and support high quality, cost-effective health care for tens
of millions of people.
- Second, we believe that Lilly scientists will use these powerful
new research tools to expand the range of fundamental questions
they can explore. They may even use these systems to devise
entirely new ways of conducting research programs that probe the
staggering complexity of the human body.
In fact, supercomputing represents a revolution...a new wave...a
"paradigm shift" in the development of modern technology. In the
years ahead, scientists at Lilly and at other institutions will use this
extraordinary research tool to do things that we simply cannot
anticipate today. For instance, it's unlikely that pioneers of molecular
biology foresaw the applications of recombinant DNA technology that
have unfolded in the past 15 years or so.
Let's move, however, from the general to the specific. I'd like to
discuss supercomputing in the context of one company's decision
making.
The investment by Eli Lilly and Company of millions of dollars in
supercomputing systems and training was a very basic business
decision. We believe that this technology will help us effectively
pursue our company's mission and meet its goals in an ever-more
challenging environment. Today, I'll focus on our pharmaceutical
business. But many of the following points are also relevant to our
other businesses.
Long-term success in the research-based pharmaceutical industry
depends on one factor: innovation. We must discover and develop
new products that address patients' unmet needs. We must discover
and develop cost-effective new products that offer economic benefits
to patients, payors, and society as a whole. Whenever possible, we
must market innovative new products before our competitors do.
Innovation has never come easy in this industry. The diseases that
afflict our species represent some of the most daunting of all
scientific mysteries. Consequently, pharmaceutical R&D has
traditionally been a high-risk...complex... time-consuming...and costly
enterprise.
How risky is pharmaceutical R&D? Scientists generally evaluate
thousands of compounds to identify one that is sufficiently promising
to merit development. Of every five drug candidates that begin
development, only one ultimately proves sufficiently safe and
effective to warrant marketing.
The risk does not end there, however. A recent study by Professor
Henry Grabowski, of Duke University, showed that only 3 of 10 new
pharmaceutical products introduced in the United States during the
1970s actually generated any profits for the companies that
developed them.
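The compounding effect of these attrition rates can be made concrete
with a small back-of-the-envelope calculation. The sketch below (in
Python) is purely illustrative: the figure of 5,000 compounds screened
per development candidate is an assumed stand-in for the "thousands"
cited above, while the one-in-five development success rate and the
three-in-ten profitability figure come from the testimony itself.

    # Illustrative attrition arithmetic; assumed values, not industry data.
    compounds_per_candidate = 5000     # assumed stand-in for "thousands" screened
    candidates_per_approval = 5        # testimony: 1 in 5 candidates reaches market
    approvals_per_profitable = 10.0/3  # testimony: only 3 in 10 marketed drugs profit

    compounds_per_approval = compounds_per_candidate * candidates_per_approval
    compounds_per_profitable = compounds_per_approval * approvals_per_profitable

    print("Compounds screened per marketed drug:   %d" % compounds_per_approval)
    print("Compounds screened per profitable drug: %.0f" % compounds_per_profitable)

Under these assumptions, roughly 25,000 compounds must be screened
for each marketed drug, and on the order of 80,000 for each drug that
ultimately earns back its costs, which is the sense in which
pharmaceutical R&D is described here as a high-risk enterprise.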
How complex is pharmaceutical R&D? Consider just some of the
hurdles involved in the evaluation of each potential pharmaceutical
product that enters the development process:
- We must complete scores of laboratory tests that probe potential
safety and efficacy.
- We must manage global clinical tests of safety and efficacy that
involve thousands of patients in a dozen or more countries.
- We must formulate dosage forms of each product that best deliver
the active ingredients to patients.
- We must develop high-quality, cost-effective, environmentally
sound manufacturing processes for compounds that are often very
complex chemical entities.
- We must prepare mountains of research data for submission to
regulatory authorities in countries around the world. For instance,
one of our recent submissions to the U.S. Food and Drug
Administration involved 900,000 pages of data assembled in well
over 1,000 volumes.
How time-consuming are these complex R&D programs? Let's go step
by step. It usually takes several years to establish a discovery-
research program in which scientists begin to identify promising
compounds. It typically takes from 5 to 8 years for us to conduct all
the tests required to evaluate each drug candidate. Then it takes
another 3 to 4 years for regulatory authorities to consider a new
drug application and approve the marketing of the new product.
Consider this example. The Lilly product Prozac represents an
important new treatment for patients suffering from major
depressive disorder. Although we introduced Prozac to the U.S.
medical community in 1988, this innovative product came from a
research program that began in the mid-1960s. The bottom line is
that discovery-research programs often take a total of two decades
or more to yield new products.
How costly are these long, complicated R&D programs? Last year, a
Tufts University group estimated that the discovery and
development of a new pharmaceutical product during the 1980s
required an investment of some $231 million in 1987 U.S. dollars.
That number is increasing rapidly. One reason is the ever-more
meticulous safety testing of drug candidates in humans. In the mid-
1970s, for instance, clinical trials of the Lilly oral antibiotic Ceclor
involved 1,400 patients. But recent clinical studies of our oral-
antibiotic candidate Lorabid encompassed 10,000 patients. Clinical-
trial costs constitute the largest portion of total drug-development
expenses -- and they have skyrocketed in recent years.
At Lilly, we believe that it will take $400 million to develop each of
our current drug candidates. And those costs do not include the
expenses required to build manufacturing facilities -- expenses that
can climb well into nine figures for hard-to-manufacture products.
Pharmaceutical R&D has become a "big science." The R&D programs
that yield new drugs need the same kinds of technical, management,
and financial commitment required to develop the most imposing
high technology products -- including supercomputers themselves.
I want to mention another dimension of our business environment.
The research-based pharmaceutical industry is unusually
competitive and cosmopolitan. Historically, no single company has
held more than 5 percent of the global market. Based on sales, the 10
or 12 top-ranking companies are very tightly clustered, compared
with most industries. These companies are based in France, Germany,
Switzerland, and the United Kingdom, as well as in the United States.
I would like to note that many of our competitors abroad are
mammoth technology-based corporations, such as Bayer, CIBA-
GEIGY, Hoechst, Hoffman-La Roche, Imperial Chemical Industries, and
Sandoz. These are truly formidable firms with superb technical
resources. Their pharmaceutical operations represent relatively small
portions of their total sales. By contrast, U.S. pharmaceutical
companies are, for the most part, smaller companies that have
focused their resources on human-health-care innovation.
In this competitive industry, the United States has an excellent
record of innovation. For instance, nearly half of the 60 new
medicines that won global acceptance between 1975 and 1986 were
discovered by U.S.-based scientists. In addition, the pharmaceutical
industry has consistently made positive contributions to this nation's
trade balance.
Over the past half decade, however, the research-based
pharmaceutical industry has experienced major changes. The rapid
escalation of R&D costs has helped precipitate major structural
changes in a sector of the global economy where the United States is
an established leader. An unprecedented wave of mergers,
acquisitions, and joint ventures has led to fewer, larger competitors.
In several cases, foreign companies have assumed control of U.S.
firms.
Competition in the research-based pharmaceutical industry will only
become more challenging during the 1990s and beyond.
Consequently, Lilly has evaluated many opportunities to reinforce its
capacity to innovate -- to reinforce its capacity to compete.
Supercomputing is a case in point:
- We believe that these powerful systems will help our scientists
pursue innovation.
- We believe that these systems will help us compete.
Now, let's move from business to science. Scientists have long been
frustrated in their efforts to address the fundamental questions of
pharmaceutical R&D. Only recently have we been able to begin
probing these questions. We've begun to probe them not through
experimentation but through the computational science of molecular
modeling. Prominent among these scientific priorities are the
following:
- The quantitative representation of interactions between drug
candidates and drug targets, especially receptors and enzymes
- The process by which proteins -- huge molecules that are
fundamental to life -- are "folded" into distinct configurations
through natural biological processes
- The properties that enable catalysts to facilitate essential
chemical reactions required to produce pharmaceutical products.
Today, I'd like to discuss the first of these challenges. I'll
concentrate on the interaction of drug candidates with receptors.
As you know, normal biological processes -- the beating of the
heart, the clotting of blood, the processing of information by the
brain -- involve complex biochemical chain reactions, sometimes
referred to as "cascades."
Let me give you an example. During these chain reactions,
natural substances in the body cause certain substances
to produce other molecules, which, in turn, cause either the next
biochemical step in the cascade or a specific response by an organ or
tissue -- a movement, a thought, the secretion of a hormone.
Over the years, scientists have found that disease often occurs
when there is either too much or too little of a key molecule in one of
these biological cascades. As a result, research groups are studying
these chain reactions, which are fundamental to life itself.
The natural substances involved in these processes link with, or
bind to, large molecules, called receptors, which are located on the
surfaces of cells. We often use this analogy: a natural substance fits
into a receptor, much like a key fits into a lock. Many scientists at
Lilly -- at all research-based pharmaceutical companies -- are
focusing their studies on receptors involved in a host of diseases,
ranging from depression and anxiety to heart attack and stroke.
Their goal is to better understand these locks and then to design and
to synthesize chemical keys that fit into them.
In some cases, we want to design chemical agents that activate
the receptor and stimulate a biochemical event. Compounds called
agonists serve as keys that open the locks. In other cases, we want to
synthesize chemical agents that block the receptor and stop a natural
substance from binding to the receptor. These compounds, called
antagonists, prevent the biological locks from working.
Unfortunately, this drug-design process is fraught with problems.
Most importantly, receptors are not typical locks. They are complex
proteins composed of thousands of atoms. Moreover, they are in
constant, high-speed motion within the body's natural aqueous
environment.
This brings us to one of the most promising applications of
supercomputing technology. Mathematicians can formulate
equations that describe virtually anything we experience or
imagine: the soft-drink can on your desk or the motion of the liquid
in that can as you gently swirl it during a telephone conversation.
Each can be expressed in numbers.
Of course, those examples are relatively simple. But scientists
can also develop equations that describe the remarkable complexity
of meteorological phenomena...geological formations...and key
molecules involved in the body's natural processes. In recent years,
they have developed mathematical models describing the realistic
motion -- the bending, rotation, and vibration -- of chemical bonds in
large molecules, such as receptors. These models are emerging as
important tools for scientists probing how potential drug candidates
would likely affect the target receptors.
These mathematical descriptions are based on equations
involving billions of numbers. Conventional computers take days,
weeks, or even longer to perform related calculations. But
supercomputers do this work in fractions of a second. A second
computer then translates the results into graphic representations on
a terminal screen.
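To give a concrete sense of what such equations look like, the sketch
below (in Python) evaluates a single harmonic bond-stretch term of the
kind used in molecular-mechanics force fields. It is a minimal
illustration, not Lilly's actual modeling software: the force constant
and equilibrium length are rough, illustrative values for a
carbon-hydrogen-like bond, and a realistic receptor model would sum
terms of this sort (bond stretches, angle bends, torsions, and
non-bonded interactions) over thousands of atoms at every step of a
simulation.

    import math

    def bond_length(atom_a, atom_b):
        """Euclidean distance between two atoms given as (x, y, z) tuples."""
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(atom_a, atom_b)))

    def bond_stretch_energy(r, r0=1.09, k=340.0):
        """Harmonic bond-stretch energy, E = 0.5 * k * (r - r0)**2.
        r0 and k are illustrative values; units are arbitrary here."""
        return 0.5 * k * (r - r0) ** 2

    # Two atoms forming a slightly stretched bond along the x axis.
    atom_a = (0.00, 0.0, 0.0)
    atom_b = (1.12, 0.0, 0.0)

    r = bond_length(atom_a, atom_b)
    print("bond length:", round(r, 3))
    print("stretch energy:", round(bond_stretch_energy(r), 4))

Repeating evaluations of this kind for every bonded and non-bonded
pair of atoms, over many time steps, is what produces the "billions of
numbers" referred to above and why supercomputer throughput matters so
much for this work.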
These graphic representations can serve as a new
communications medium -- and a new "language" -- for scientists.
Teams of scientists can share the same visualized image of how a
specific chemical agent would likely affect the receptor in question.
They can quickly evaluate the probable effects of modifications in
the chemical. They can generate entirely new ideas -- and analyze
them. They can focus the painfully slow efforts required to
synthesize and test compounds on those agents that appear to have
genuine potential.
Supercomputers enable scientists to see what no one else has
seen. Historically, technical breakthroughs that have dramatically
expanded the range of human perception -- from early telescopes
and microscopes to modern cyclotrons and electron microscopes --
have enabled the research community to make landmark discoveries,
develop revolutionary inventions, and pioneer new academic
disciplines. We have every reason to believe that supercomputing
can do the same.
Now, let's return to the Lilly experience. Several years ago, the
interest in supercomputing began to grow at Lilly Research
Laboratories. We considered a number of ways to evaluate this
research tool. Obviously, supercomputers don't do anything by
themselves. They would only be relevant to our mission and our
goals if Lilly scientists actively and creatively embraced them. We
had to see whether our biologists, chemists, and pharmacologists
could really apply those graphic representations of receptors and
enzymes to real drug-discovery problems.
In January 1988, we took the first step: Lilly became an
industrial partner in the National Center for Supercomputing
Applications (NCSA) at the University of Illinois. The opportunity to
learn about supercomputing afforded to us by interacting with the
NCSA proved to be an essential element in our supercomputing
decision. Many of our scientists were indeed interested in learning
how to use supercomputers. Many of them quickly began to apply
the systems to their work.
In April 1990, our supercomputing program took a great step
forward with the installation of a Cray 2S-2/128 system at our
central laboratories in Indianapolis. Lilly scientists are using the
system at a far greater rate than we expected. In the meantime,
we've maintained our relationship with the NCSA to ensure
maximum support for our program and to keep abreast of new
developments in the field.
Our experience to date suggests three interrelated advantages of
supercomputing that we believe will make Lilly even more
competitive in the years ahead.
- We believe these systems will speed up the identification of
promising drug candidates. Supercomputing will enable Lilly
scientists to design new drug candidates that they otherwise would
not have even considered. Supercomputing may well cut days,
weeks, even months from the overall process required to identify
novel compounds.
- We believe these systems will foster greater collaboration among
scientists from various disciplines who are involved in
pharmaceutical R&D. Productive research in our industry
increasingly depends on teamwork. supercomputer-generated
graphic simulations help scientists with diverse academic training to
share the same vision of crucial data. Again, these visual images
become a common language for scientists with different academic
training.
Moreover, supercomputing will make these multidisciplinary
research efforts more spontaneous, energetic, and intense. In the
past, our research was a step-by-step process in which long periods
often separated the formulation of ideas from experiments required
to test those ideas. But supercomputing helps teams of scientists
integrate their ideas and tests into a dynamic, interactive process.
These systems facilitate the communication, creativity, and decision
making that are critical to productive R&D programs.
- We believe these systems will encourage truly visionary
exploration. A spirit of unfettered inquiry drives scientific progress.
In the past, however, scientists were unable to test many novel ideas
because they didn't have sufficient computing power. Now,
supercomputers are motivating our scientists to ask "what if?" more
boldly than ever before -- and to help them quickly consider many
possible answers to their questions.
It's especially interesting to watch scientists actually get familiar
with supercomputing. As you know, good scientists are among the
most independent people in any society. They respect good theories.
But they demand empirical data to support the theories. In six
months, I've seen some pretty tough-minded chemists move from
skepticism to genuine enthusiasm for these systems. Moreover, we
clearly see that many of the very brightest young Ph.D.s coming out
of graduate school are very enthusiastic about this technology. Our
supercomputing capabilities have become a recruiting magnet.
I want to stress that supercomputing is only one of a number of
powerful new technologies that research-based pharmaceutical
companies are applying to their drug-discovery programs. But it's a
very powerful scientific tool -- a tool that will become all the more
powerful with networking capabilities.
- A supercomputer network will greatly facilitate the dynamic
collaboration among scientists at different locations -- often different
institutions. Lilly scientists are working with research groups at
universities and high technology companies around the world. A
national supercomputer network would greatly enhance the
effectiveness of joint efforts with our colleagues at the University of
Michigan or the University of Washington at Seattle, for example.
- A supercomputer network will help us optimize scarce
scientific talent during a period when we're almost certain to
experience major shortfalls in the availability of Ph.D.-level
scientists. I would go so far as to suggest that the visualization
capabilities of supercomputing may actually help attract more of the
best and the brightest into the sciences -- this at a time when key
industries in the U.S. economy desperately need such talent.
Finally, I can't overemphasize that a supercomputing network
will help scientists ask questions whose answers they could never
seriously pursue before. Tens of thousands of our best thinkers will
find applications for this technology that will totally outstrip any
predictions that we venture today. Supercomputing represents a
revolution...a new wave...a paradigm shift in the development of
modern technology.
In conclusion, I want to stress two points. We believe that
supercomputers and a national supercomputing network are
important to our company, to our industry, and to the medical
professionals and patients we serve. We believe that
supercomputing will play a crucial role in many technology-based
industries and in the growth of national economies that depend on
these industries. Again, we strongly recommend the enactment of S.
272.
Thank you.